The next big milestone for FSD is version 11. It is a significant upgrade with fundamental changes to several parts of the FSD stack, including a totally new way to train the perception NN.

From AI Day and the Lex Fridman interview we have a good sense of what might be included.

- Object permanence both temporal and spatial
- Moving from “bag of points” to objects in NN
- Creating a 3D vector representation of the environment all in NN
- Planner optimization using NN / Monte Carlo Tree Search (MCTS) (see the sketch after this list)
- Change from processed images to “photon count” / raw image
- Change from single image perception to surround video
- Merging of city, highway and parking lot stacks a.k.a. Single Stack
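
For the planner item above, a toy illustration of how MCTS picks a maneuver may help. This is only a minimal sketch under invented assumptions - the maneuver set, transition model, and scoring function below are all made up for illustration and are not Tesla's actual planner, which reportedly combines learned networks with search.

```python
# Minimal, illustrative MCTS over a handful of hypothetical driving maneuvers.
# State, actions, transition model, and reward are invented placeholders.
import math
import random

ACTIONS = ["keep_lane", "lane_left", "lane_right", "slow_down"]

class Node:
    def __init__(self, state, parent=None, action=None):
        self.state = state        # hypothetical (lane_index, speed_mph)
        self.parent = parent
        self.action = action
        self.children = []
        self.visits = 0
        self.value = 0.0

def ucb(child, parent_visits, c=1.4):
    # Upper Confidence Bound: trade off exploiting good maneuvers vs. exploring new ones.
    if child.visits == 0:
        return float("inf")
    return child.value / child.visits + c * math.sqrt(math.log(parent_visits) / child.visits)

def step(state, action):
    # Toy transition model standing in for a learned dynamics/comfort model.
    lane, speed = state
    if action == "lane_left":
        lane -= 1
    elif action == "lane_right":
        lane += 1
    elif action == "slow_down":
        speed -= 5
    return (lane, speed)

def score(state):
    # Toy reward: prefer staying near lane 1 and near 65 mph.
    lane, speed = state
    return -abs(lane - 1) - abs(speed - 65) / 10.0

def mcts(root_state, iterations=500, rollout_depth=5):
    root = Node(root_state)
    for _ in range(iterations):
        node = root
        # 1. Selection: walk down the tree by UCB until reaching a leaf.
        while node.children:
            node = max(node.children, key=lambda ch: ucb(ch, node.visits))
        # 2. Expansion: add one child per candidate maneuver.
        if node.visits > 0:
            node.children = [Node(step(node.state, a), node, a) for a in ACTIONS]
            node = random.choice(node.children)
        # 3. Simulation: cheap random rollout to estimate how the branch turns out.
        state = node.state
        for _ in range(rollout_depth):
            state = step(state, random.choice(ACTIONS))
        reward = score(state)
        # 4. Backpropagation: credit every node on the path back to the root.
        while node is not None:
            node.visits += 1
            node.value += reward
            node = node.parent
    return max(root.children, key=lambda ch: ch.visits).action

print(mcts((1, 65)))  # e.g. "keep_lane"
```

A real planner would replace the toy transition and scoring functions with learned models and vehicle dynamics, and search over continuous trajectories rather than four discrete maneuvers.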

Lex Fridman interview of Elon, starting with FSD-related topics.


Here is a detailed explanation of Beta 11 in "layman's language" by James Douma, from an interview done after the Lex podcast.


Here is the AI Day explanation in 4 parts.




Here is a useful blog post asking Tesla a few questions about AI Day. The most useful part is its comparison of Tesla's methods with Waymo's and others' (detailed papers linked).

 
The recall is a done deal. You can't "test" a recall on some customers and then implement it. NHTSA was clear: no new FSD Beta software installs until the recall is met. There may be other reasons, but the recall CAN'T be one of them.

Please, anyone correct me: didn't Tesla set a self-imposed deadline of 3/31 for the recall? Wouldn't that, if true, mean the big push is coming in the next day or so?

Damn, whenever I say push, I think of salt n pepa!
 
I mean, they have done some incredibly complex and amazing things, so an additional algorithm here seems fairly straightforward. Admittedly it does not play to their strengths, since it involves speed changes.
Since none of us work for Tesla, and more specifically in their AI/FSD Beta coding department, we cannot say for sure how easy or hard something is to code. The only thing we can do is guess. In this case, I'd use Occam's Razor - all things being equal, the simplest answer tends to be the right one. The fact that we've not seen these "easy" changes implemented yet would indicate they are not easy to implement. Another possibility is that they are on a project list, with a low priority and will be implemented at some point in the future as they work through higher priority items.
 
Do we know what FSD Beta 11.x does if 10.x would have fallen back to legacy Autopilot stack?

10.x unavailable.jpg


Here 10.69.25.2 allows activating Autosteer with Traffic Light and Stop Sign Control with a red warning of "Full Self-Driving unavailable." It also from time to time showed a yellow warning that the left repeater needed clearing (here covered by snow).

I have a feeling that FSD Beta 11.3.3 will just not allow activating Autopilot, but potentially one could trick it by having a profile with "Autopilot Features" set to "Autosteer (Beta)" to activate Autopilot then switch profiles to one configured to "Full Self-Driving (Beta)."
 
Another possibility is that they are on a project list, with a low priority and will be implemented at some point in the future as they work through higher priority items.
Yeah it is probably this.

They very likely work off a Pareto when it comes to prioritizing feature improvements and additions.
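
As a hedged illustration of what Pareto-style prioritization might look like (the issue names and impact numbers below are entirely made up, not Tesla data): rank issues by estimated impact and work the short head of the list that covers most of the total.

```python
# Toy Pareto-style triage: rank issues by estimated impact, then work the short
# head of the list that accounts for ~80% of the total. Issue names and numbers
# are invented for illustration only.
issues = {
    "phantom braking": 400,
    "stationary-object handling": 350,
    "late lane selection": 120,
    "unnecessary blinker use": 80,
    "slow roundabout entry": 50,
}

total = sum(issues.values())
cumulative = 0
for name, impact in sorted(issues.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += impact
    print(f"{name}: {impact} ({cumulative / total:.0%} cumulative)")
    if cumulative / total >= 0.8:
        break  # everything after this point is the long tail that gets deferred
```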

I would guess that not piling into stationary and stopped objects remains a higher priority.

In terms of ease of implementation, they can already maintain distance from a vehicle in front, so they seem to have the underlying technology. Obviously there's a decision and planning tree here too, which makes it significantly more complex than that.

But eventually this will be high on the Pareto and then they will need to do it.

I don’t think anyone is taking issue with this being important though. After all, it is part of The Fundamentals of Driving 101!

Based on Chuck’s videos it doesn’t look like they do anything with this. 😭
 
How is it doing with blind spots when driving with traffic at a similar speed? I guess it was probably not a big enough freeway to properly test - with two lanes it is usually less of an issue (you are either passing or following). At 4 or 5 lanes it starts to be the norm that you are driving close to the same speed as adjacent vehicles.

This is a key usability issue for in-city travel (less of an issue for long-distance travel due to above) so hopefully they have addressed it, since it is an easy one.
The highway I was on was mostly 3 lanes. I didn't notice any problems related to blind spots, but frankly I wasn't watching too closely for that. I switched between keeping minimal lane changes on and off just to see how that setting affected lane changes. It worked properly. I did notice on the 75-mile return trip that the added space given to large trucks moved me too close, IMO, to the car in the next lane. NoA always moved over some for trucks, so I'm not sure why V11 needed to add more space.
 
Since none of us work for Tesla, and more specifically in their AI/FSD Beta coding department, we cannot say for sure how easy or hard something is to code. The only thing we can do is guess. In this case, I'd use Occam's Razor - all things being equal, the simplest answer tends to be the right one. The fact that we've not seen these "easy" changes implemented yet would indicate they are not easy to implement. Another possibility is that they are on a project list, with a low priority and will be implemented at some point in the future as they work through higher priority items.
Absolutely.

Some people (not OP) tend to assume the team they are talking about - the one that doesn't do what they want it to do - is stupid, etc., and that's why it can't do these "simple" things.

I think the far more likely truth is that in a company like Tesla - consistently among the top 5 most desirable places for engineers to work (along with SpaceX) - there are valid reasons why the team is not doing something:
- It's not easy
- EM told them not to do it
- It's not a priority

I always assume the teams (or for that matter companies or countries or political campaigns) are staffed by motivated, intelligent and diligent people. Now they may not always be successful or do the "right" thing on the whole. But it won't be because of stupidity.
 
Bingo!
- It's not a priority

People often blame software developers for being "lazy" or "sloppy" when in reality their problem is with the prioritization and investment decisions the company has made.

For FSD, Tesla most likely has a huge mountain of potential bugs or improvements to work on, much more than they can do all at once, so definitionally most get deprioritized.

They've also got a ton of data we don't see that could be leading them to work on some specific rare, but critical, safety related issues that the average FSD tester never encounters. With something like 50 million miles of testing done, they've had a few airbag-deploy-severity crashes (based on their metrics for miles per crash) and many more near misses. All of these are potentially situations so rare the average driver would never see them but could make sense for Tesla to prioritize. Averting a serious FSD incident is more important than minor complaints we testers encounter every day.
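
Just to make the "a few crashes" framing concrete, here is back-of-the-envelope arithmetic. Both numbers are assumptions for illustration: the ~50 million Beta miles comes from the sentence above, and the miles-per-airbag-deployment rate is purely hypothetical, not a Tesla figure.

```python
# Illustrative only: rough expected count of airbag-deployment crashes.
beta_miles = 50_000_000               # approximate figure cited above
miles_per_airbag_crash = 10_000_000   # hypothetical rate, NOT a Tesla statistic
print(beta_miles / miles_per_airbag_crash)  # -> 5.0, i.e. "a few"
```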

Plus, we know Tesla is testing FSD internationally already. That's consuming at least some of their time. And they're likely investing in other work that won't be seen by us for a while either (example: V11 single stack was in internal testing for like a year before we saw it!).

TLDR: All of this adds up to the externally visible prioritization decisions, and the progress made on the project, looking much more disjointed and poorly planned than they actually are. Tesla probably has sophisticated, data-driven planning and prioritization processes that we can only speculate about. This post was obviously speculative, but this is my best-effort tea-leaf reading based on 6+ years working in software product development.
 
Bingo!


People often blame software developers for being "lazy" or "sloppy" when in reality their problem is with the prioritization and investment decisions the company has made.

For FSD, Tesla most likely has a huge mountain of potential bugs or improvements to work on, much more than they can do all at once, so definitionally most get deprioritized.

They've also got a ton of data we don't see that could be leading them to work on some specific rare, but critical, safety related issues that the average FSD tester never encounters. With something like 50 million miles of testing done, they've had a few airbag-deploy-severity crashes (based on their metrics for miles per crash) and many more near misses. All of these are potentially situations so rare the average driver would never see them but could make sense for Tesla to prioritize. Averting a serious FSD incident is more important than minor complaints we testers encounter every day.

Plus, we know Tesla is testing FSD internationally already. That's consuming at least some of their time. And they're likely investing in other work that won't be seen by us for a while either (example: V11 single stack was in internal testing for like a year before we saw it!).

TLDR: All of this adds up to the externally visible prioritization decisions, and the progress made on the project, looking much more disjointed and poorly planned than they actually are. Tesla probably has sophisticated, data-driven planning and prioritization processes that we can only speculate about. This post was obviously speculative, but this is my best-effort tea-leaf reading based on 6+ years working in software product development.
Maybe, but perhaps it is as it appears to be, as well…
 
Today was the first time I did my regular shopping trip after upgrading to 11.3.3, from 10.69.25.2. It was also the first time it made the trip in both directions without disengagement, even with poor GPS!

Improvements:
  • Now correctly handles poor GPS accuracy (See thread: FSD + GPS (in)accuracy = fail)
  • Made a smooth left turn at an intersection where it would normally come to a dead stop in the road. This particular intersection is in the middle of a sharp bend, which always confused FSDb in the past.
  • It then proceeded down this narrow road, with cars on either side, slower than it normally would. Most people go slowly down this road despite the speed limit being higher.
  • Turned into the shopping center road slower, and proceeded slower after turning in. In the past it would go way too fast.
  • It only phantom-indicated at 2 different bends in the road; it normally indicates at about 5.
Did not notice any regressions, so overall a more positive experience.

I also noticed it now shows the overhead pedestrian crossing lights, rendered in blue. In the past it ignored them completely. None of the lights flashed on this trip, so I'm interested to see what it does when they do flash.
 
Yes, and then it needed a slow pull-out into the intersection. Waiting 2 seconds at a stop sign is more tolerable if FSDb avoids those darn intersection crawls. Hopefully they didn't just move the crawl feature to after the stop and into the intersection.
Roundabouts with nobody in them are fine. Have to try them at 3 pm when school lets out before giving a thumbs up. Some odd slowdowns with no apparent reason, but generally confident action everywhere. Ego blew a lane selection, but there was no signage and the painted arrows were obscured by other cars. This is 2022.45.12 / V11.3.3.
 
Roundabouts with nobody in them are fine. Have to try them at 3 pm when school lets out before giving a thumbs up. Some odd slowdowns with no apparent reason, but generally confident action everywhere. Ego blew a lane selection, but there was no signage and the painted arrows were obscured by other cars. This is 2022.45.12 / V11.3.3.
I've also noticed some performance degradation based on the roundabout's size (radius), number of lanes, and the number of exits.
 
Took a 300 mile round trip drive today, 80/20 highway/city. Observations:
- lane changes were very smooth and blinker use excellent. Love how it slows a bit if required so a vehicle can move past, then immediately blinks and changes lanes. Perfect.
- tended to stay left too much; I waited, then manually blinked to move right several times.
- moved well to the left of the lane while passing semis and large trucks; well done and just fine with me.
- "Nags" - someone complained they were terrible, so I tested. Happened to me twice, and only when I looked down at the bottom of the screen for 10 seconds (yes, I timed it). If my eyes glanced up at all during those 10 seconds, no nags. Looked left or right to test: no nags - only when looking down, and only then after 10 seconds without glancing up at the road (a rough sketch of this timer logic is at the end of this post). Seems very reasonable, and if you're not watching the road, too bad IMHO - you're the people that make this an issue.
- following distance is too far even on the "aggressive" setting; it needs to be 30 to 50 percent less. People cut in on busy freeways because the gap is too large. I started manually accelerating to shorten the gaps.
- Zero "phantom braking" events! A couple of minor slowdowns for undetermined reasons, nothing really. The real test will be long stretches of desert highways on a long trip in May, fingers crossed.
- disengaged a couple of times when confused at intersections, will test more

Overall a very pleasant experience which could be easily made even better with minor adjustments. Getting intersections down will take more work but I’m mainly concerned with highway driving so I can wait. Interested to see what 11.3.4 fixes!
Forgot to add: had fairly heavy rain for 30 minutes or so; got the warning popup, but it wasn't an issue.
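
Based purely on the timing described in the "Nags" bullet above, the behavior could be approximated by a simple look-away timer. This is a rough sketch under assumed values (the 10-second threshold comes from my informal timing; everything else is invented), not Tesla's actual driver-monitoring code.

```python
# Rough sketch of a look-away nag timer: nag only after ~10 continuous seconds
# of looking away from the road; any glance back up resets the countdown.
# Threshold and polling interval are assumptions, not Tesla's implementation.
import time

NAG_THRESHOLD_S = 10.0

def monitor(gaze_on_road):
    """gaze_on_road: callable returning True while the driver is looking at the road."""
    look_away_start = None
    while True:
        if gaze_on_road():
            look_away_start = None                      # glancing up resets the timer
        elif look_away_start is None:
            look_away_start = time.monotonic()          # just started looking away
        elif time.monotonic() - look_away_start >= NAG_THRESHOLD_S:
            print("Nag: pay attention to the road")
            look_away_start = None                      # fresh countdown after nagging
        time.sleep(0.1)
```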
 
following distance is too far even on the "aggressive" setting; it needs to be 30 to 50 percent less. People cut in on busy freeways because the gap is too large. I started manually accelerating to shorten the gaps.
People always say this, but this is a good thing. It is exactly what is supposed to happen. People cut in and then you keep falling back. It works really well and feels comfortable and relaxed. Everyone wins. There is literally zero downside!

I would be very surprised if the following distance is too relaxed on aggressive. But it would be great if true.
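
For context on the gap debate, here is quick arithmetic converting time gaps to distance at highway speed. The gap values are illustrative examples; I don't know what distances the chill/average/aggressive settings actually target.

```python
# Illustrative time-gap-to-distance arithmetic at highway speed. The gap values
# are examples, not the actual following-distance settings FSD Beta uses.
MPH_TO_FPS = 5280 / 3600  # 1 mph ≈ 1.47 ft/s

speed_mph = 70
for gap_s in (1.0, 1.5, 2.0):
    feet = speed_mph * MPH_TO_FPS * gap_s
    print(f"{gap_s:.1f} s gap at {speed_mph} mph ≈ {feet:.0f} ft")
```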
 
People always say this, but this is a good thing. It is exactly what is supposed to happen. People cut in and then you keep falling back. It works really well and feels comfortable and relaxed. Everyone wins. There is literally zero downside!

I would be very surprised if the following distance is too relaxed on aggressive. But it would be great if true.
We’ll just have to disagree on that, I see no reason for it. Not relaxing for me for sure. Check back after you’ve tried the update.