FSD Beta 10.69

In this particular case, the traffic lights are awfully hung. I could easily see human drivers being confused about whether the light is red or green:

View attachment 850411
Yeah, coming to this intersection for the first time, I would have to study it for a few seconds to figure out what is going on... while the locals behind me honk at me to get out of the way.

EDIT: Oh and I wonder what happens when the wind starts swinging those lights around :oops:
 
Did they suggest sativa hemp oil?🤣🤣
😁 I wish they had. There are usually really smart guys down at our local auto parts store, but on a Saturday a high school kid looked out the window and asked if that was my car. I said yes and asked what blend of oil he would recommend (with a wink and a smile). He took me seriously and said, "Uh, I don't know. What did the car originally have in it after service?" That's when I knew I shouldn't have joked about it. Not wanting to embarrass him by admitting it was a joke, I just said I didn't know and would ask Tesla. Great price on wheel cleaner, which is what I actually went in for.
 
In terms of the disengagement metrics, I assume disengagement data used for something like regulatory approval would need to be provided in detail, likely broken down into multiple categories, each with thresholds that need to be satisfied. A high-level total of disengagements per mile would not be sufficient, for the reasons we're all aware of, unless the rollout is geofenced to specific conditions.

Not only can roads and conditions be classified, but they could probably even report on disengagements (or lack thereof) when the system is attempting specific types of maneuvers, categorized by risk level and so on. If you look at the questions the regulators are asking, all of this is clearly understood, and they expect granularity from Tesla.

Like these questions asked in the Phantom Braking investigation:

1662742302532.png


Part of the OEDR is classifying road types and scenarios, determining which maneuvers have which risk level based on conditions, etc. I've noticed FSDBeta will sometimes ask for steering wheel input when it's approaching what I would deem a higher-risk scenario, as if it's checking to make sure you're watching closely right before it tries something wild.
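To illustrate the kind of granularity being discussed, here is a minimal, hypothetical sketch (all road types, risk levels, and numbers below are invented for illustration) of how fleet disengagement data could be bucketed by road type and maneuver risk instead of being reported as one fleet-wide total:

```python
# Hypothetical sketch: bucketing disengagement data by road type and
# maneuver risk level, rather than one fleet-wide total. All names and
# numbers are made up for illustration.
from collections import defaultdict

# (road_type, maneuver, risk_level, miles_driven, disengagements)
trips = [
    ("surface", "unprotected_left", "high", 120.0, 3),
    ("surface", "lane_keep", "low", 800.0, 1),
    ("highway", "lane_change", "medium", 2500.0, 2),
]

# Accumulate miles and disengagements per (road type, risk level) bucket.
totals = defaultdict(lambda: [0.0, 0])
for road, maneuver, risk, miles, dis in trips:
    bucket = totals[(road, risk)]
    bucket[0] += miles
    bucket[1] += dis

# Report a rate per 1,000 miles for each bucket.
for (road, risk), (miles, dis) in sorted(totals.items()):
    rate = dis / miles * 1000
    print(f"{road:8s} {risk:6s} {rate:6.2f} disengagements per 1k mi")
```

A regulator could then attach a separate acceptance threshold to each bucket, which is exactly what a single fleet-wide number cannot support.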
 
I took another trip around our town, this time with my wife. It was her first time on this version.

Amazingly, zero disengagements! (Now I'm sounding like Mars... 😁)

This version is WAY better than the previous version in many ways.

There are four places where I always navigate manually. Today, all of them were handled successfully.

- Often, if there is one car in front and we are stuck in traffic at one particular spot, it used to suddenly decide that we were in a right-turn-only lane and switch on the right turn signal. That didn't happen today. It doesn't mean it won't happen again, but it's a good start.

- The ugly left turn that failed yesterday went OK. It simply drove over a yellow crosshatched area to reach the left turn lane instead of doing a wild S-turn. All the humans here do this instead of the S. Could be a fluke.

- On another left turn, one oncoming car came straight through while the other made its left turn. FSDb correctly waited and then went for the left turn, just as I would. She said it was as smooth as my normal driving.

- That odd driving behaviour from last night, two stops away from my house, didn't happen today. It handled the situation correctly.

The biggest compliment of all: my wife (not a techie) said she couldn't tell, about 70% of the time, that FSDb was driving. And this with zero disengagements and no manual accelerator pushes.
 
There is no way that just a high-level total of disengagements per mile would be sufficient, for the reasons we're all aware of, unless the rollout is geofenced to specific conditions.

Not only can roads and conditions be classified, but they could probably even report on disengagements (or lack thereof) when the system is attempting specific types of maneuvers, categorized by risk level and so on. If you look at the questions the regulators are asking, all of this is clearly understood, and they expect granularity from Tesla.
They are two different things.

They are asking what type of "testing" Tesla has performed. Regulators are old school - they don't understand stats. They think that if you "test" once, that is sufficient, so they just list a few different scenarios. They don't understand the hundreds of thousands of variations that happen in real life. That is also why the Level 3 standard specifies which tests to run for "5 minutes".

To correctly categorize disengagements in detail (like Waymo or Cruise would), the total number of disengagements has to be low. So, when you have a few dozen vehicles, you can do it. I can do a detailed analysis for my car. Tesla, with 100,000 testers, can't do detailed analysis; they have to rely on mostly automated analysis.
 
They are two different things.

They are asking what type of "testing" Tesla has performed. Regulators are old school - they don't understand stats. They think that if you "test" once, that is sufficient, so they just list a few different scenarios. They don't understand the hundreds of thousands of variations that happen in real life. That is also why the Level 3 standard specifies which tests to run for "5 minutes".

To correctly categorize disengagements in detail (like Waymo or Cruise would), the total number of disengagements has to be low. So, when you have a few dozen vehicles, you can do it. I can do a detailed analysis for my car. Tesla, with 100,000 testers, can't do detailed analysis; they have to rely on mostly automated analysis.
I think the regulators are working hard to catch up. The problem is that the few autonomy experts who exist will likely be more inclined to join a company like Tesla and get stock options than to join a government body, which is probably quite a bit less lucrative.

But if we look at something like the Autopilot crash-per-mile stats that Tesla publishes, last month the NHTSA started asking questions like this:

1662747290105.png


And then I noticed this at the bottom of that page on the Tesla site. It might have always been there, but I think it may have been added after the questions were asked:

1662747490071.png
 
I think the regulators are working hard to catch up. The problem is that the few autonomy experts who exist will likely be more inclined to join a company like Tesla and get stock options than to join a government body, which is probably quite a bit less lucrative.
You don't have to be an expert on autonomy, just well informed - and anyone should be able to get well informed in a week.

Oh, BTW, didn't they appoint as a consultant some professor who claimed to be an expert (and was thoroughly anti-Tesla)? Any number of experts are available for consultation.
 
Just staring at the TeslaFi firmware page... Thought I'd pull out some interesting stats.

Among HW3 vehicles on TeslaFi:

- 16.95% are on an FSD Beta version less than 10.69
- 4.45% are on FSD Beta 10.69 or 10.69.1.1 (it appears that everyone on 10.69.1 has been moved over to 10.69.1.1)
- 37.35% are on 2022.24 or higher
- And 38.64% are on a non-FSD 2022.20 branch

It's curious that there are still so many FSD Beta drivers on 10.12 and below, and so many non-FSD Beta drivers on 2022.20. I wonder if people in the FSD Beta queue presently on 2022.20 are being held back from 2022.24+ until after 10.69.2.
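For what it's worth, shares like these are just per-version counts divided by the fleet total. A quick sketch (the raw counts below are invented, chosen only to roughly reproduce the percentages above):

```python
# Hypothetical sketch of tallying TeslaFi-style fleet shares from raw
# firmware version strings. The counts below are invented for
# illustration and merely approximate the percentages in the post.
from collections import Counter

fleet = (
    ["2022.20.18"] * 3864    # non-FSD 2022.20 branch
    + ["2022.24.8"] * 3735   # 2022.24 or higher
    + ["10.12.2"] * 1695     # FSD Beta below 10.69
    + ["10.69.1.1"] * 445    # FSD Beta 10.69.x
    + ["2022.16.3"] * 261    # everything else
)

counts = Counter(fleet)
total = len(fleet)
for version, n in counts.most_common():
    print(f"{version:12s} {100 * n / total:5.2f}%")
```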
 
You don't have to be an expert on autonomy, just well informed - and anyone should be able to get well informed in a week.

Oh, BTW, didn't they appoint as a consultant some professor who claimed to be an expert (and was thoroughly anti-Tesla)? Any number of experts are available for consultation.
I feel like you might be underappreciating what goes into this. There are probably people who spend their entire careers just researching road design and traffic engineering.
 
I think the regulators are working hard to catch up. The problem is that the few autonomy experts who exist will likely be more inclined to join a company like Tesla and get stock options than to join a government body, which is probably quite a bit less lucrative.

But if we look at something like the Autopilot crash-per-mile stats that Tesla publishes, last month the NHTSA started asking questions like this:

View attachment 850720

And then I noticed this at the bottom of that page on the Tesla site. It might have always been there, but I think it may have been added after the questions were asked:

View attachment 850722
If I were the NHTSA, I would ask Tesla for the crash rate per mile for TACC only. :p
 
I feel like I have done my part to properly describe the behaviors of FSD Beta 10.69.

Prediction: I think it will provide noticeable improvements in behavior to users, specifically:
1) Turning smoothness
2) Unmarked streets that are wide enough (correct positioning and behavior around corners).

Otherwise, I predict people will largely have most of the same complaints as before (lane selection, left turns, right turns, stopping, going, etc.). I predict people will generally not be impressed by unprotected left turn (ULT) performance.

Anyway, it is definitely a step forward in spite of all this. I hope everyone gets it soon, and that, against all odds, they have somehow resolved a bunch of issues in 10.69.2 and I am completely wrong.
 
In your opinion, good enough to drop the Safety Score requirement? Or still best to keep it gated?
I think it should definitely be kept gated for now. It can still frequently do very incorrect things, and I’m not talking about comfort issues.

I have no idea when it will be good enough for that sort of issue to not be a concern.
 
In your opinion, good enough to drop the Safety Score requirement? Or still best to keep it gated?
I still think they should keep it gated. There have definitely been significant improvements in FSDb but there are still significant safety issues that need to be resolved before a wider, ungated release. I’m optimistic that these issues can be addressed, although not as quickly as Elon says. (Is anyone that optimistic?)

Remember - when Autopilot was released we had idiots climbing out of the driver’s seat to take a nap. There is no advantage to Tesla releasing FSDb at this point and a lot of downsides in all areas. I say this as someone who is not a perpetual FSDb pessimist, like @AlanSubie4Life , nor a Tesla fanboy optimist.