Welcome to Tesla Motors Club

FSD Beta 10.69

Here is FSD Beta making a random turn off a main road with no destination set. It happened in the first two minutes. This is the same behavior I saw.

A Tesla with FSDb, just like a horse, always has a default destination of preference. Human driving of a Tesla, or a horse for that matter, is just an override of those preferences. With FSDb engaged and no destination assigned, you have left those preferences available in the background. So be prepared, or tell the car where it can go before departure. I find it also helps if I talk to my Tesla gently before a drive, brush it down, give it some water, maybe some Apple Music, and let it know everything is just fine... But today you ain't gonna visit that Model S you got your camera eye on, no way, so forget that...
 
I have a question about FSD and the Nav maps. Typically I set my work destination just after leaving my house. Before we had the option to pick the preferred route, I used to add a Starbucks as a stop, then delete it once I passed the turn for the route I didn't want. I may or may not actually go to Starbucks.

This morning I added Starbucks to the route. I had already ordered coffee with the app. I go to add it and voilà, it and another store I go to are at the top of the list? I haven't been there for a month. What really got weird was when I went downtown, 20-30 miles away. I'm heading back to the office and decide to order a salad with my Wegmans app. I order the salad, go to add the stop when I'm twenty miles away, and look what's at the top of the list: Wegmans. Starbucks is gone.

Do we have Apple and Tesla spying on us, inferring that since an app was recently used there is a good probability I'll be heading to that store? It was too easy to add those destinations.
I had exactly the same thing happen to me with this version of the software. Eventually, I had to take a job at Starbucks to get the software to operate correctly. So you may want to try that...
 
Yeah pretty common.

Practically speaking not an issue since it’s so rare that things would ever get to this point, anyway. (I virtually never have FSD engaged when I get up to a red light.)

So some risk, but not a lot. One of the reasons FSDb is so safe!
I’ve seen people complain about it running red lights - what I found interesting here was that it was actually fabricating a green light.

(Looks like the upload didn’t work last time so here’s the picture of the display again showing the red light and the fabricated green light.)
[Attached image: display showing the red light and the fabricated green light]
 
I have a route I drive almost daily, and every time FSD is upgraded, the end of the route (the final turn into the destination) changes. I thought this was due to the software change in the new FSD, but today another family member drove the car and turned off FSD, so when I got in I had to turn it back on, and the final turn into my destination had changed again. So maybe I was incorrect about the cause of the changing route.
Anyway I am very pleased with how well the auto high beam now works.
 
I’ve seen people complain about it running red lights - what I found interesting here was that it was actually fabricating a green light.
Perhaps AI systems sometimes just make stuff up.

Today I played with OpenAI's ChatGPT and asked how well Tesla FSD works. At one point it said that Musk tweeted on 10/12/2020, "Tesla’s FSD is not a fully self-driving system and still requires the driver to pay attention at all times." This struck me as not something Musk would say, so I searched and was not able to find such a tweet. Later on I asked ChatGPT if it had given me that quote, and it said no. It revised its quote to say FSD was not fully autonomous. When I asked, it also explained that "full self-driving" and autonomous are not the same thing.

So not only can an AI imagine things, it can lie to cover up its errors. Creepy. Too much like some people...
 
I have caught ChatGPT in many factual errors. It's a good demonstration tool but relying on it for information could be embarrassing.
 
Interesting, 2022.44.200 reported on TeslaFi. Car's previous version was 25.2.
TeslaFi.com Firmware Tracker

Interesting previous versions for 2022.44.100, including one vehicle downgraded from 2023.2.10.

Hmmmm. I've always been suspicious of the "impossible to downgrade" party line.

The previous version on the 2022.44.200 cars was 2022.40.30.5.

[Attached screenshot: TeslaFi firmware tracker]
 
I don't think Tesla has any idea what you have ordered, but they do keep track of which places you have navigated to in the past, and put those toward the top of the list. I'm not sure what all they track, but they for sure track which day of the week, and maybe the time.
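The ranking behavior described above (past destinations scored by visit frequency plus day-of-week and time-of-day) could be sketched like this. To be clear, this is pure speculation — Tesla's actual suggestion logic is not public, and the weights, function names, and trip data here are all invented for illustration:

```python
from collections import Counter

# Toy sketch of destination ranking by frequency plus day/time match.
# Entirely hypothetical -- Tesla's real logic is not public; the
# scoring weights and sample trips below are made up.
def rank_suggestions(past_trips, day, hour, window=2):
    """past_trips: list of (destination, day_of_week, hour) tuples.
    Returns destinations ordered from most to least likely suggestion."""
    scores = Counter()
    for dest, trip_day, trip_hour in past_trips:
        scores[dest] += 1                      # overall visit frequency
        if trip_day == day:
            scores[dest] += 3                  # same day of week
        if abs(trip_hour - hour) <= window:
            scores[dest] += 2                  # similar time of day
    return [dest for dest, _ in scores.most_common()]

trips = [("Work", "Mon", 8), ("Work", "Tue", 8), ("Work", "Wed", 8),
         ("Starbucks", "Sat", 9), ("Starbucks", "Sat", 9),
         ("Wegmans", "Sat", 12)]

print(rank_suggestions(trips, "Sat", 9))   # weekend morning: Starbucks first
print(rank_suggestions(trips, "Mon", 8))   # weekday commute: Work first
```

Even a crude score like this reproduces the effect people are seeing: the same history produces different top suggestions on a weekday morning versus a weekend.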
I deliberately did things differently this weekend, and there doesn't appear to be any data passing between apps. But otherwise I was wrong: Tesla is doing as you say, tracking time of day and our habits. As the header "suggested," it's watching my driving habits and building a pattern. During the week I get one group of suggestions depending on the time of day; on weekends I run a totally different pattern, and the suggestions change with it.

It certainly makes adding a destination to the route a lot easier.

I wonder if my FSD computer thinks I'm Boring? :rolleyes:
 
I wonder if my FSD computer thinks I'm Boring? :rolleyes:
Oh of course not. Let me explain AI to you. The AI computer is completely non-judgmental about your human tastes. It doesn't think you're boring, it only thinks you're predictable.

Meaning that when the singularity comes, a minimal number of neural network computing cycles will need to be assigned to enslave you and run your life.
 
I had a pretty good drive with FSD beta last night. It seems to alternate between good and terrifying every session -- maybe it's my expectations going in?

FSD Ramble:

This is all without traffic since I drive late at night, but it really had some impressive stretches where I once again thought this thing might actually work someday. I think one of the big issues is how every place chooses to mark its lanes, plus GPS issues. I've taken the same route three times and there are a couple of consistent points of failure that I'm curious to see if they'll ever get solved. One is a right turn where the closest lane is actually a bus lane. Once it turns into the bus lane it finds itself with a solid lane line to its left, so I think it's kind of boxed into the wrong lane. I'm not really sure how the car could handle this unless it can use GPS data to avoid it in the future.

The next big issue is that it tries to make a left turn too early to reach my destination. This is kind of a bad one, as it gets in the turn lane, starts making the left turn, and then seemingly realizes its mistake and kind of freaks out. It feels to me like it's going to go right into oncoming traffic, but I've been too chicken not to take over and find out exactly what it will do. Turn lanes in general seem to give FSD problems, as a lot of them here have solid lane lines, so that if the car is at all late getting into the turn lane it won't cross the solid line and then attempts to make the turn from the wrong lane (it does this consistently on one turn on my way to work).

The other consistent failure is when driving in the right lane of a highway and there's an exit with no dotted lines demarcating the exit lane. It veers to the right and then jerks back to the left once the exit is passed. Not the end of the world, but you look like a moron. Neither the Hyundai nor the ID4 does this on this particular highway; they just track the left side of the lane marker.

I wish there were a way to more easily guide FSD when it's making mistakes. I do get why the steering has to be firmer than other systems, because it actually makes full turns, but it would be nice if there were a mode for highways with corrective steering that more closely mimicked the systems of the Hyundai and the ID4. Both of those are so much less stressful to use on highways, and we got the ID4 after the Tesla, so it's not just a matter of being more used to the other systems (the capacitive steering wheel is so nice on the ID4).
 
I don’t care about my disengagement rate. I don’t use the beta because it can’t even do such basic things as driving straight and stopping comfortably behind cars and at red lights. For a while, when we were able to report stuff, I thought that maybe Tesla cared about our feedback and was working on improvements. As time went on, a year later, I realized the beta is really just to shut us up. They care so little that they even took the report button away.
 
At this point disengagement rates are useful when we look at order of magnitude rather than small differences. All we can say is that the disengagement rate is around 1 in 10 miles, i.e., it is more like 1 in 10 miles than 1 in 100 miles or 1 per mile.

Yesterday I went to a frequent shopping place but took a different route (thanks to alternate routes). In the first 3 miles that route has 6 roundabouts. I disengaged in 3 of them. At the end there was some construction that forced everyone into the left turning lane, but you'd just go straight at the traffic light from the turning lane. Obviously FSDb was confused. So, 4 disengagements in that 10-mile drive.
 
Yes, the current average is ~1 disengagement per 10 miles, but a lot of this is due to drives like the example you gave.

1 drive of 20 miles with 0 disengagements + 1 drive of 10 miles with 3 disengagements
= 3 disengagements per 30 miles (1 disengagement per 10 miles)
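That pooling arithmetic — total miles divided by total disengagements, rather than averaging each drive's rate — can be written out as a quick sketch; the drive numbers are just the ones from the example above:

```python
# Pool disengagement counts across drives: total miles over total
# disengagements, not an average of per-drive rates.
def miles_per_disengagement(drives):
    """drives: list of (miles, disengagements) tuples, one per drive."""
    total_miles = sum(m for m, _ in drives)
    total_dis = sum(d for _, d in drives)
    return total_miles / total_dis if total_dis else float("inf")

# One 20-mile drive with 0 disengagements, one 10-mile drive with 3:
drives = [(20, 0), (10, 3)]
print(miles_per_disengagement(drives))  # 30 / 3 = 10.0 miles per disengagement
```

Pooling matters because a per-drive average would be skewed by short bad drives (and undefined for clean ones); dividing the totals gives the 1-in-10-miles figure directly.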

Little issues are holding back collective progress.