Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
The next big milestone for FSD is 11. It is a significant upgrade, with fundamental changes to several parts of the FSD stack, including a totally new way to train the perception NN.

From AI Day and the Lex Fridman interview, we have a good sense of what might be included.

- Object permanence both temporal and spatial
- Moving from “bag of points” to objects in NN
- Creating a 3D vector representation of the environment all in NN
- Planner optimization using NN / Monte Carlo Tree Search (MCTS)
- Change from processed images to “photon count” / raw image
- Change from single image perception to surround video
- Merging of city, highway and parking lot stacks a.k.a. Single Stack
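Of the items above, the planner change is the most concretely algorithmic. As a rough, hypothetical sketch only (this is not Tesla's implementation; the 1-D state, actions, and reward below are invented purely for illustration), MCTS repeatedly selects a leaf with UCB1, expands one untried action, runs a random rollout, and backpropagates the reward:

```python
import math
import random

class Node:
    def __init__(self, state, parent=None):
        self.state = state      # toy state: lateral offset on a 1-D road
        self.parent = parent
        self.children = {}      # action -> child Node
        self.visits = 0
        self.value = 0.0        # sum of rollout rewards seen through this node

ACTIONS = (-1, 0, 1)            # toy actions: steer left / hold / steer right
GOAL = 3                        # toy target offset

def rollout(state, depth=5):
    """Random playout from `state`; reward grows the closer we end to GOAL."""
    for _ in range(depth):
        state += random.choice(ACTIONS)
    return 1.0 / (1.0 + abs(state - GOAL))

def select(node):
    """Descend via UCB1 until reaching a node with an untried action."""
    while len(node.children) == len(ACTIONS):
        node = max(node.children.values(),
                   key=lambda c: c.value / c.visits
                   + math.sqrt(2 * math.log(node.visits) / c.visits))
    return node

def mcts(root_state, iterations=3000):
    root = Node(root_state)
    for _ in range(iterations):
        leaf = select(root)
        # Expansion: try one action not yet explored from this node
        action = random.choice([a for a in ACTIONS if a not in leaf.children])
        child = Node(leaf.state + action, parent=leaf)
        leaf.children[action] = child
        # Simulation + backpropagation of the rollout reward to the root
        reward = rollout(child.state)
        while child is not None:
            child.visits += 1
            child.value += reward
            child = child.parent
    # The recommended action is the most-visited child of the root
    return max(root.children, key=lambda a: root.children[a].visits)
```

With a fixed seed and the defaults, the most-visited root action from state 0 should come out as +1, i.e. stepping toward the goal. A real planner would score whole trajectories with comfort and collision costs rather than this made-up reward, but the select/expand/simulate/backpropagate loop is the same.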

Lex Fridman's interview of Elon, starting with FSD-related topics.


Here is a detailed explanation of Beta 11 in layman's terms by James Douma, in an interview done after the Lex podcast.


Here is the AI Day explanation in 4 parts.




Here is a useful blog post asking a few questions to Tesla about AI Day. The useful part is the comparison of Tesla's methods with Waymo and others (detailed papers linked).

 
Are we not going to discuss how much BS this is? Sure, the cars don’t crash on FSD beta. But they sure would if people didn’t disengage before! I love my FSD beta car but lying to a bunch of investors about its quality is absurd. I have full confidence that, if left alone with no one in the driver’s seat, FSD beta would crash within 20 miles of driving. I know most everyone here would agree.

It’s like Tesla wants to be sued. I just don’t get it.
That's not how it works.

There are two systems now:
- FSDb
- Human supervision

Tesla is saying that together they are better than human only. You are making a tangential argument that FSDb alone is worse than human only. Duh.
 
Are we not going to discuss how much BS this is? Sure, the cars don’t crash on FSD beta. But they sure would if people didn’t disengage before! I love my FSD beta car but lying to a bunch of investors about its quality is absurd. I have full confidence that, if left alone with no one in the driver’s seat, FSD beta would crash within 20 miles of driving. I know most everyone here would agree.

It’s like Tesla wants to be sued. I just don’t get it.
I’m not saying this is the case, however:

If you had a reduction in crash rate in identical situations, even if it meant interventions on the part of the driver from time to time, that would be a win.

That is what they are claiming (however the cohorts and the driving scenarios are very different, which is the problem with their comparison).
 
Almost all of my disconnects are due to the car being too cautious and hesitating when it could clearly proceed. Very seldom do I disconnect because the car is in danger of crashing into something.

Glad to hear that you're having such a good experience with the current build.

I love Beta and have enjoyed the ride since the first public release, and I don't want to stop using it.

That said, even the current version is plenty dangerous for me.

Going straight through intersections that are right-turn only.

Speeding up and running stop signs.

Confusing a bike lane with the only real lane of traffic, swerving between them, and finally coming to a dead stop with traffic behind.

Turning left into oncoming traffic at simple two-lane, four-way stops where there is an island separating the directions.

Last-minute changes across two lanes to make a left turn.

All of these would likely have resulted in an accident given the kind of traffic around at the time. Certainly not every time, but it has made these same mistakes numerous times.

I can go on and on. But I'm sure this already sounds like complaining when it's not.

And I wouldn't want someone to ask me to upload video to prove that I'm having these issues because they can't take my word for it.
 
Glad to hear that you're having such a good experience with the current build.

I love Beta and have enjoyed the ride since the first public release, and I don't want to stop using it.

That said, even the current version is plenty dangerous for me.

Going straight through intersections that are right-turn only.

Speeding up and running stop signs.

Confusing a bike lane with the only real lane of traffic, swerving between them, and finally coming to a dead stop with traffic behind.

Turning left into oncoming traffic at simple two-lane, four-way stops where there is an island separating the directions.

Last-minute changes across two lanes to make a left turn.

All of these would likely have resulted in an accident given the kind of traffic around at the time. Certainly not every time, but it has made these same mistakes numerous times.

I can go on and on. But I'm sure this already sounds like complaining when it's not.

And I wouldn't want someone to ask me to upload video to prove that I'm having these issues because they can't take my word for it.
I'm sorry that you got one of the bad cars.
 
Service has replaced my cameras (on one car) and done a full recalibration twice. Pretty sure it's not a bad car, especially since I have two of them from separate years and they both do the same thing.

But thanks for the kind words. Appreciated.
 
Are we not going to discuss how much BS this is? Sure, the cars don’t crash on FSD beta. But they sure would if people didn’t disengage before! I love my FSD beta car but lying to a bunch of investors about its quality is absurd. I have full confidence that, if left alone with no one in the driver’s seat, FSD beta would crash within 20 miles of driving. I know most everyone here would agree.

It’s like Tesla wants to be sued. I just don’t get it.
Sure, but as long as FSDb is L2, the driver is part of the system. It might make more sense to say "Drivers using Tesla FSDb" drive more miles per collision than unassisted drivers.
 
That's not how it works.

There are two systems now:
- FSDb
- Human supervision

Tesla is saying that together they are better than human only. You are making a tangential argument that FSDb alone is worse than human only. Duh.
This pretty much describes my way of using FSDb now.

I drive far more than I used to and have fewer close-call situations, partially because I do disengage when I feel I need to. And I am less tired than I used to be after driving.

I can't say I gained nothing from FSDb.
 
This pretty much describes my way of using FSDb now.

I drive far more than I used to and have fewer close-call situations, partially because I do disengage when I feel I need to. And I am less tired than I used to be after driving.

I can't say I gained nothing from FSDb.
Yeah, I feel the same.
My disengagements are typically just me being alarmed that the car wants to do something differently than I perhaps would, and my brain goes "whoa there big fella". But most of the time it gives me useful feedback that I appreciate and makes my drive more enjoyable.
 
I’m not saying this is the case, however:

If you had a reduction in crash rate in identical situations, even if it meant interventions on the part of the driver from time to time, that would be a win.

That is what they are claiming (however the cohorts and the driving scenarios are very different, which is the problem with their comparison).

The Tesla tweet on this talks about "airbags-deployed" crashes, which could filter out a lot of minor collisions. FSDb-assisted drivers could have more fender-benders than unaided drivers without impacting that statistic. But even so, a reduction of serious crashes seems like a win.
 
The Tesla tweet on this talks about "airbags-deployed" crashes, which could filter out a lot of minor collisions. FSDb-assisted drivers could have more fender-benders than unaided drivers without impacting that statistic. But even so, a reduction of serious crashes seems like a win.
Why would you think that a reduction of serious crashes would be accompanied by an increase in minor ones? If the car mitigates serious crashes to the point that they become so minor that airbags do not deploy, then it follows that minor crashes would tend to be avoided altogether.
 
Why would you think that a reduction of serious crashes would be accompanied by an increase in minor ones? If the car mitigates serious crashes to the point that they become so minor that airbags do not deploy, then it follows that minor crashes would tend to be avoided altogether.
I didn't say that. I said that _if_ there was an increase of minor crashes, it would not show up in the statistic. I have hopes that that is not the case--but even if it is, in my opinion a reduction in major crashes would outweigh it.
 
I didn't say that. I said that _if_ there was an increase of minor crashes, it would not show up in the statistic. I have hopes that that is not the case--but even if it is, in my opinion a reduction in major crashes would outweigh it.
But what would lead to an increase in minor crashes? You have a hypothesis but put forth no evidence or plausible explanation for it. It's pure speculation.
 
But what would lead to an increase in minor crashes? You have a hypothesis but put forth no evidence or plausible explanation for it. It's pure speculation.
I have no hypotheses. My point was that the statistic as described is not sensitive to the signal of a change in the number of minor crashes. Not that I think there would be more. Or less for that matter. I can speculate either way, but cannot prove it one way or the other from the statistic that was offered.
 
Service has replaced my cameras (on one car) and done a full recalibration twice. Pretty sure it's not a bad car, especially since I have two of them from separate years and they both do the same thing.

But thanks for the kind words. Appreciated.
Map quality is the biggest factor in most of my disengagements, and I wonder if that contributes to your bad experience? That may help explain why there is such variation in people's perception of how well FSD does. It sure would be helpful if Tesla could tell us whether there is any way the beta testers could help with this issue, and if not, to tell us that.

Last night I had to drive through an old "mill" town near me and the drive was much worse than I normally see. The problems mostly centered on mapping issues.
 
Map quality is the biggest factor in most of my disengagements, and I wonder if that contributes to your bad experience?
It's either map quality or that language model treatment they have for lane selection (which may suffer from GIGO). There's a 45 mph road near me with an exit lane into an industrial park. Exit lanes are common on that road and the car drives past them without a hitch. But if I so much as turn on cruise control, my car will go into that particular exit lane for no apparent reason. It doesn't slow for the exit turn, and I have to quickly take over, so I don't get a chance to find out what the car's next move would be.

It's just one of those "out of the blue" things that my car does quite predictably, just like changing lanes multiple times away from an upcoming turn. But only at specific intersections.
 

Tesla seems to report 3 different numbers for the "US Average" crash rate: 0.5M miles at Investor Day, 0.6M in the tweet, and 0.7M (652k) in the Vehicle Safety Report. Maybe the tweet reflects the numbers for 2023 Q1, so looking at the latest Autopilot data for Q1, that's 6.57M miles per crash in 2022.

EPA fuel economy uses a weighting of 55% city driving to 45% highway driving. Directly applying that ratio, and assuming FSD Beta 11.x matches existing Autopilot's highway safety while FSD Beta's 3.2M miles per crash so far has all been city streets, suggests somewhere around 4.7M miles per crash as people get 11.x, basically an order of magnitude over the 0.5M US Average number from the slide.

Of course, actual usage of FSD Beta will greatly affect the numbers just like how Autopilot usage makes it somewhat misleading to directly compare to US Average. Notably, people argued that Autopilot was mostly easy miles on highway whereas the average included city streets, but on the flip side, FSD Beta has only been city streets, so with the release of 11.x, it should be able to accumulate these "easy" miles much faster?
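Spelling out the weighted-average arithmetic in that estimate (using the figures assumed in the post, which mix sources and years):

```python
# Figures assumed in the post, in millions of miles per crash
city_fsd_beta     = 3.2    # FSD Beta so far (city streets)
highway_autopilot = 6.57   # Autopilot in 2022 (mostly highway)
us_average        = 0.5    # "US Average" from the Investor Day slide

# EPA-style 55% city / 45% highway mileage weighting
blended = 0.55 * city_fsd_beta + 0.45 * highway_autopilot
print(round(blended, 2))               # 4.72 million miles per crash
print(round(blended / us_average, 1))  # 9.4x the US Average figure
```

Note this averages miles-per-crash directly; weighting the crash rates (the reciprocals) with the same 55/45 split instead gives about 4.2M miles per crash, so either way the estimate lands near an order of magnitude above 0.5M.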
 
FSD Beta has only been city streets, so with the release of 11.x, it should be able to accumulate these "easy" miles much faster?
Hopefully they’ll keep the breakdown separate, since the scenarios are so different, and easily partitioned.

But I assume they will just do whatever makes things look the best. It will be annoying and unclear, that is certain.

They really need to compare to TACC-only use in identical scenarios, though maybe the data is too sparse for TACC only.
 
I mentioned something I've heard others state several times in the hundreds of beta videos.
So you're repeating rumor and hearsay without bothering to confirm or verify it first, yet you get annoyed when someone called you out and actually asked for a source?
Otherwise I claim you're just looking to be argumentative. As for FOS, thanks for the courteous conversation.
🤷‍♂️not sure how asking you to verify a claim is being argumentative.
Besides which someone posted a link already.
The link didn't apply.
And it’s also off topic.
Just responding to your claim.
 
But if they dial it back, there would be more crash scenarios. Reducing false positives tends to increase false negatives.
That assumes a system with static abilities. The entire point of the development is to allow fewer false positives and fewer false negatives.

That's not how it works.

There are two systems now:
- FSDb
- Human supervision

Tesla is saying that together they are better than human only. You are making a tangential argument that FSDb alone is worse than human only. Duh.
What about autopilot?
 
That assumes a system with static abilities. The entire point of the development is to allow fewer false positives and fewer false negatives.
Right, I'm talking about a given system. It's obviously a lot more difficult to reduce both, but that remains the objective with every release.


What about autopilot?
Not sure I understand the question…