
FSD Beta 10.69

Pressing the accelerator is a mitigation, not a solution.
‘Just pressing the accelerator’ is a far more significant intervention than most people realize. Fundamentally, at an intersection, all you are evaluating is whether you need to stop or whether it’s safe to go. When you press the accelerator, you’ve done all the work. The only thing the car needs to do is steer. So when you say “I just pressed the accelerator,” recognize that you just did all the work that FSD is supposed to do.
 
Just a minor mention of something in the release notes, to you all:
---
- Increased recall of forking lanes by 36% by having topological tokens participate in the attention operations of the autoregressive decoder and by increasing the loss applied to fork tokens during training.
---
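
(Aside on the second half of that note: "increasing the loss applied to fork tokens during training" most likely just means weighting those tokens more heavily in the training loss. A rough sketch of that idea, assuming a per-token cross-entropy; the function name, tensor shapes, and weight value here are my guesses, not anything from Tesla:)

```python
import torch
import torch.nn.functional as F

def lane_token_loss(logits, targets, is_fork_token, fork_weight=3.0):
    """Per-token cross-entropy with extra weight on fork tokens (illustrative only).

    logits:        (batch, seq_len, vocab) raw scores from the lane decoder
    targets:       (batch, seq_len) integer ids of the correct lane tokens
    is_fork_token: (batch, seq_len) bool mask marking tokens that encode a fork
    """
    # cross_entropy expects (batch, vocab, seq_len) when computing per-position losses
    per_token = F.cross_entropy(logits.transpose(1, 2), targets, reduction="none")
    # up-weight the loss wherever the target token is a fork; everything else stays at 1.0
    weights = 1.0 + (fork_weight - 1.0) * is_fork_token.float()
    return (weights * per_token).mean()
```
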
OK. This is something that I've been half-expecting for some time. "Increased recall of forking lanes" implies that there's some actual memory picked up from driving around in the FSD-B.

Ever since I learned about neural networks, one of the Standard Things in the architecture of neural networks is that they can have, potentially, feedback that changes the weights in the neural net.
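
(By "feedback that changes the weights" I mean something like online learning. A toy sketch below, with made-up inputs, shapes, and learning rate, purely to illustrate the idea of a correction nudging the weights:)

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=4)                   # weights of a toy single-layer model

def predict(x):
    """Sigmoid score for a made-up 'OK to proceed?' decision."""
    return 1.0 / (1.0 + np.exp(-x @ w))

def online_update(x, corrected_label, lr=0.01):
    """One gradient step on a single feedback example (logistic loss)."""
    global w
    err = predict(x) - corrected_label   # gradient of the logistic loss w.r.t. the logit
    w -= lr * err * x                    # the feedback literally changes the weights

# e.g. the driver overrode the car and proceeded, so label this situation "go" (1.0)
online_update(np.array([0.2, 1.0, -0.5, 0.3]), corrected_label=1.0)
```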

Some time back on another forum, far away, when people were arguing about how well the auto-function on the windshield wipers was doing, Tesla introduced this "hit the button on the end of the stalk if you want it to start" business and implied that this was being fed back, somehow, to a Better Windshield Wiper Auto Function. Since anybody can say anything on the interwebs, I stated that I thought that this might be actual feedback into the local neural network that was doing the water-on-windshield shtick. Others, naturally, disagreed, and, since Tesla never reveals this kind of stuff, the argument went where all such arguments about Tesla software go: Nowhere.

Maybe you all had spotted this business about "recall of forking lanes" in a previous release, but I sure hadn't. The implication is that the neural network in the Tesla (or some really interesting database) is collecting information about missed turns with the intent of improving them over time. And it doesn't necessarily appear that this information is being sent to Tesla; it may be kept locally on the car. Perhaps it's sent to the mothership from time to time to improve overall fleet performance? Maybe it's linked to the maps database?

In any case: Interesting.

Wonder if this means that a given car's performance will be better on the Nth time through an intersection as compared to the first.

Hm. And it might explain why, in the previous release, when my M3 had kept on trying to smack itself against the end of a Jersey barrier in a particular construction zone, it stopped Doing That after a map database update.

The plot thickens.
 
‘Just pressing the accelerator’ is a far more significant intervention than most people realize. Fundamentally, at an intersection, all you are evaluating is whether you need to stop or whether it’s safe to go. When you press the accelerator, you’ve done all the work. The only thing the car needs to do is steer. So when you say “I just pressed the accelerator,” recognize that you just did all the work that FSD is supposed to do.
Yes, it's wonderful. Also direct feedback to Tesla, marking the exact point where FSD was in error, as has been mentioned, and the input can be used for training. I'm surprised they don't actively encourage it.
 
Maybe you all had spotted this business about "recall of forking lanes" in a previous release
"Recall" has been used in plenty of release notes including 10.69's "recall of animals," "recall of forking lanes" and "recall of object detection" and previous version's "recall of crossing lanes," "recall of far-away crossing objects" and "recall for vehicles directly behind ego." But yes this is the first time specifically about "forking lanes" I believe.

This is reporting that false negatives have been reduced, so that the network detects a thing when that thing is actually there. Whereas "precision" is about reducing false positives, so that the network doesn't predict "phantoms" when the thing isn't actually there.
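
To put numbers on that (these are the generic textbook definitions, nothing specific to Tesla's pipeline):

```python
def precision_recall(true_positives, false_positives, false_negatives):
    """Recall penalizes misses; precision penalizes phantoms."""
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    return precision, recall

# e.g. out of 100 real forks the network finds 80 (misses 20) and also
# hallucinates 5 forks that aren't there:
p, r = precision_recall(true_positives=80, false_positives=5, false_negatives=20)
print(p, r)   # ~0.94 precision, 0.80 recall; "increased recall" means fewer of those 20 misses
```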

Tesla most likely calculates these recall numbers from video snapshots sent back and turned into training data, measuring how many previously "missed" predictions are now correctly predicted.

Specifically for forking lanes, this means 10.69 should correctly identify a fork when previous versions might have thought it was the same lane or a continuation of an adjacent lane.
 
Not sure if this will load, but this most recent update is the first time I have made it through this particular intersection without an intervention.



[Attached image]
 
Had .2 drive me home from work today. Definitely a more refined drive. I took over once due to a janky situation: a guy was taking a left in front of me from our super-wide lane. Normally traffic just flows around cars doing that, and I didn’t want FSD to stick me behind him and then have traffic route around me, so I took over.

But much smoother in general. I feel like I am able to leave it on during most turns without inconveniencing cars around me.
 
I did repeated testing this afternoon at a highly occluded unprotected right turn that beta has never been able to do safely. I've taken this turn successfully with 10.69.1.1 and 10.69.2 a couple times, but this is the first rigorous test I've done.

The intersection is from a neighborhood street onto a six-lane divided arterial road with a median blocking any left turn. On the left is a large wall, resulting from widening this road some years ago, which creates a highly occluded view of the right-most cross lane. Realistically, there should be a traffic light at this intersection, but we must play the cards we are dealt. So...

This turn has some precise creep requirements in order for the car to see well to the left. To evaluate this, I set up a GoPro camera just above the left side B-pillar camera so that I could review approximately what the car can see. I had previously reviewed dashcam video and confirmed that the car does not position itself correctly to see cross traffic on the left repeater camera. Unfortunately, Tesla does not make the B-pillar camera video available, so I use the GoPro as a substitute.

The results were as I expected. The car is able to safely make the turn. It establishes a perfect creep wall, and when it creeps up to it, the B-pillar camera can see far enough to the left to detect cars that would be a hazard. However, there are two issues with how this turn is made. First, the car will usually not proceed unless all three lanes appear clear of traffic. This is an unreasonable gap requirement, as the cross street is often quite busy. Traffic in the left-most lane is generally no danger, and the Tesla could turn onto the road with that traffic present.

The second issue is that the car does not commit to the turn very well. It slowly eases into the right cross lane, and only after it is fully blocking the lane does it accelerate away. Once the car has occupied the lane, there is no benefit to sitting there. It needs to commit.

Here's a view from the GoPro B-pillar surrogate. You can see how well the creep works. It would actually be better if the car angled itself more to the right so the cross road enters the left repeater camera's FOV. It is possible to do that within the creep limits. But this works as well.

[Attached image: view from the GoPro B-pillar surrogate during the creep]
 
‘Just pressing the accelerator’ is a far more significant intervention than most people realize. Fundamentally, at an intersection, all you are evaluating is whether you need to stop or whether it’s safe to go. When you press the accelerator, you’ve done all the work. The only thing the car needs to do is steer. So when you say “I just pressed the accelerator,” recognize that you just did all the work that FSD is supposed to do.
How are you sure the system is "working" (aka processing very slowly) vs. incorrectly determining that it's not safe to go? If you provide no feedback Tesla will assume that it was not in fact safe to go and that will cause training errors.
 
If you're lucky… I've been on 2022.20.8 for about a month. 10.69.1/2 is on the .20 branch, so I had hoped I might get back in. Nope. Safety score is 100 for the last 8,847 miles. Should have just kept my car that had the beta on it. Nothing like buying a new car, only to lose functionality. Thanks, Tesla.
Yes, it seems that buying another Tesla makes you a new Tesla owner, regardless of how long you have been driving a Tesla, or whether or not you previously had FSDb. I will probably do what I did last time, which is turn off my FSDb request so I can get regular software updates, and make the request again when there is a widely available version and I am about to go on a long trip. It worked the last time with my M3.
 
I am pretty sure recall is being used in this sense: Precision and recall - Wikipedia
Exactly. The use of the word "recall" to mean something quite different from "memory retrieval" threw me for a loop when I first saw it. I don't particularly like this use of the word, but it's now well established in the jargon of machine perception, so we just have to get used to it.

This recently showed up in many of the tweets by the Waymo engineer Warren Craddock that were re-posted by @diplomat33 in the Waymo thread. He said that Lidar has nearly "100% recall", not meaning that it's remembering anything, but that it reliably sees occupancy of nearly all objects in the scanned landscape.

So unfortunately, there's no indication of any local memory reinforcement being applied in the current Tesla FSD architecture. But because of the arcane use of the word "recall" in the public release notes, this impression is likely to be picked up in the user/enthusiast discussions.

And then, someday when they do start to add self-learning memory of the local driving environment, everyone will have to be careful not to use the word "recall" when discussing it!
 
How are you sure the system is "working" (aka processing very slowly) vs. incorrectly determining that it's not safe to go? If you provide no feedback Tesla will assume that it was not in fact safe to go and that will cause training errors.
I don't think so. SleepyDoc might have a point. From what I understand from folks who actually work on this stuff, there is not necessarily a need for a user to provide feedback in order for the car to report interventions such as pressing the accelerator or brake while on FSDb. This prompts me to ask more questions about that the next time I am out there.
 
Did some additional drives hoping I would see an improvement after the poor drives of this morning. Unfortunately, all my drives this afternoon were no better. Just lots of jerkiness and more disengagements than normal. Just hoping 69.2 improves over the next few days; otherwise I may need to reduce the percentage of the time I use FSD. (upgraded from 10.12.2)
 
I don't think so. SleepyDoc might have a point. From what I understand from folks who actually work on this stuff, there is not necessarily a need for a user to provide feedback in order for the car to report interventions such as pressing the accelerator or brake while on FSDb. This prompts me to ask more questions about that the next time I am out there.
I think you might have made a typo somewhere in there...
The only feedback the user can provide is accelerator input, disengagements, and the report button. Or are you saying that the car can detect when it is doing something wrong without any user intervention?
 
Did some additional drives hoping I would see an improvement after the poor drives of this morning. Unfortunately, all my drives this afternoon were no better. Just lots of jerkiness and more disengagements than normal. Just hoping 69.2 improves over the next few days; otherwise I may need to reduce the percentage of the time I use FSD. (upgraded from 10.12.2)

Did you try it in the same situations where you've used 10.12.2? Or are you using it around pedestrians? I feel like it's noticeably smoother back to back on the same type of low-traffic suburban roads I drive. I had a couple of wonky behaviors around pedestrians, but it seems like that is to be expected.
 
The left turns I experienced today chose a much better trajectory on both marked and unmarked roads. FSDb historically loves to cut corners in the turn, which can be awkward when a car is approaching on the road I'm turning onto. 10.69.2 made some turns almost with L-shaped trajectories. A few times I thought the car was just going to go forward instead of turning, but then it turns quite sharply. A bit of an overcompensation, but still, I prefer this over cutting the corner (which humans do all the time, and it annoys me).
 
No FSD Beta for me, but I was just pushed 2022.20.9, oddly enough. So I think Tesla is deliberately keeping people in the Beta Queue back from 2022.24.

Possible you've mistaken 2022.20.9 for 2022.20.15, @TheProton?
Absolutely was on the .16 release, I am sure, as that is what stopped me from getting FSD the last time, and I changed my update preference from advanced to standard so I didn't get ahead of FSD again.
 
First drive with 10.69.2 today. Most of the same issues (staying in lane on a curvy two-lane road, school zones, etc.). Didn't get to test the "go around parked car" bug. But now, as I approach a left turn with a short left turn lane from a three-lane highway, the car wants to work its way right at least one lane. Tried about 1 mile out, then again about 1/2 mile out, and then again about 1000 yards out. I cancelled the lane change every time, but maybe tomorrow I will let it change lanes and see what the hell it was going to do to make the left turn. Still doesn't engage the left turn signal until it's already slowed down, entered the left turn lane, and almost come to a stop. I guess people in California are a hell of a lot more patient than drivers in Atlanta.
 
Absolutely was on the .16 release, I am sure, as that is what stopped me from getting FSD the last time, and I changed my update preference from advanced to standard so I didn't get ahead of FSD again.
And I now have FSD after 9 months of trying. Verified it is 20.15 in the app and on my first drive. I believe the TeslaFi stats showed one vehicle making the same upgrade. Was not me (I don't subscribe), but I'm not the only one, it would seem…
 