Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
The next big milestone for FSD is V11. It is a significant upgrade, with fundamental changes to several parts of the FSD stack, including a totally new way to train the perception NN.

From AI day and Lex Fridman interview we have a good sense of what might be included.

- Object permanence both temporal and spatial
- Moving from “bag of points” to objects in NN
- Creating a 3D vector representation of the environment all in NN
- Planner optimization using NN / Monte Carlo Tree Search (MCTS)
- Change from processed images to “photon count” / raw image
- Change from single image perception to surround video
- Merging of city, highway and parking lot stacks a.k.a. Single Stack
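None of these internals are public, but the first item above has a simple intuition: keep tracking an object for a while even when no camera currently detects it. A toy pure-Python sketch of that idea (all names and the timeout are invented, not Tesla's implementation):

```python
MEMORY_FRAMES = 10  # invented: how long to remember an occluded object

class Tracker:
    """Remembers objects across frames even when detections drop out."""
    def __init__(self):
        self.last_seen = {}  # object id -> frame index it was last detected

    def update(self, frame, detections):
        for obj_id in detections:
            self.last_seen[obj_id] = frame
        # Forget objects not seen for more than MEMORY_FRAMES frames.
        self.last_seen = {o: f for o, f in self.last_seen.items()
                          if frame - f <= MEMORY_FRAMES}
        return set(self.last_seen)  # objects the planner should still consider

t = Tracker()
t.update(0, {"car_1", "pedestrian_7"})
# pedestrian_7 walks behind a parked truck: no detection for a few frames...
print(t.update(5, {"car_1"}))    # still contains 'pedestrian_7'
print(t.update(20, {"car_1"}))   # pedestrian_7 forgotten after the timeout
```

The real system presumably does this inside the network with learned memory rather than an explicit timeout, but the planner-facing effect is the same: occluded objects don't blink out of existence.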

Lex Fridman's interview of Elon, starting with the FSD-related topics.


Here is a detailed explanation of Beta 11 in layman's language by James Douma, in an interview done after the Lex podcast.


Here is the AI Day explanation, in 4 parts.




Here is a useful blog post asking a few questions to Tesla about AI day. The useful part comes in comparison of Tesla's methods with Waymo and others (detailed papers linked).

 
I often see school zone signs that say "Speed Limit 20 when children are present". That's a tough one to solve.
It is indeed. There are two simple rules that could help:

1) If the time is between 7AM and 5PM, set the speed limit to the sign, whether or not school is in session. Better to be safe than sorry.
2) If there are any pedestrians visible anywhere in range of the cameras, set the speed limit to the sign.
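As a toy sketch of the two rules above (this is just an illustration of the proposed logic, not anything Tesla ships; the hours and function names are from this post):

```python
from datetime import time

SCHOOL_ZONE_LIMIT = 20  # mph, from the "when children are present" sign

def school_zone_speed(now, pedestrians_visible, normal_limit,
                      start=time(7, 0), end=time(17, 0)):
    """Return the speed limit to use in a school zone.

    Rule 1: during school hours, always use the school zone limit.
    Rule 2: outside those hours, still use it if any pedestrian is visible.
    """
    if start <= now <= end:
        return SCHOOL_ZONE_LIMIT
    if pedestrians_visible:
        return SCHOOL_ZONE_LIMIT
    return normal_limit

print(school_zone_speed(time(8, 30), False, 35))   # 20 (school hours)
print(school_zone_speed(time(20, 0), True, 35))    # 20 (pedestrian seen)
print(school_zone_speed(time(20, 0), False, 35))   # 35
```

The downsides discussed below follow directly from rule 2: any pedestrian, dog-walker or not, triggers the lower limit.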

The downside is that people will complain when their car slows down when school is not in session. Or people will complain when their car slows down because it detected a pedestrian out for a walk near a school.

You can't win the school zone scenario. Someone is going to be upset - either people will be upset that the car doesn't slow down in school zones, or people will be upset that their car IS slowing down in school zones because it saw someone walking their dog and therefore lowered the speed because "children were present".

If your next argument is that the car should know the difference between an adult and a child, I'll point out that many teenagers are the same size as adults.

So - the car is damned if it does, and damned if it doesn't. Instead, it leaves it up to the driver's discretion to determine whether the car should slow down: use the scroll wheel to lower the set speed if you know you're going into a school zone and there are/will be children present. It's a driver assist feature - it's assisting you, but you're still in control.
 
I could see Tesla just avoiding school zones mapping wise if the signage solution is challenging. Shouldn’t be too much of a time deviation in many cases.

Additionally, there have been quite a few occasions now where I haven't seen a pedestrian but the car has, whether it's dark or the streets are super busy. The car should be more responsive than a human in these scenarios.
 
It is indeed. There are two simple rules that could help:

1) If the time is between 7AM and 5PM, set the speed limit to the sign, whether or not school is in session. Better to be safe than sorry.
2) If there are any pedestrians visible anywhere in range of the cameras, set the speed limit to the sign.

The downside is that people will complain when their car slows down when school is not in session. Or people will complain when their car slows down because it detected a pedestrian out for a walk near a school.

You can't win the school zone scenario. Someone is going to be upset - either people will be upset that the car doesn't slow down in school zones, or people will be upset that their car IS slowing down in school zones because it saw someone walking their dog and therefore lowered the speed because "children were present".

If your next argument is that the car should know the difference between an adult and a child, I'll point out that many teenagers are the same size as adults.

So - the car is damned if it does, and damned if it doesn't. Instead, it leaves it up to the driver's discretion to determine whether the car should slow down: use the scroll wheel to lower the set speed if you know you're going into a school zone and there are/will be children present. It's a driver assist feature - it's assisting you, but you're still in control.
If they can't get it from reading the sign or from map data, it's probably best to just have it always be the posted limit and let the driver override it. I wouldn't want it looking to see if there are people because that wouldn't tell you the law. You could end up speeding through a school zone.
 
If they can't get it from reading the sign or from map data, it's probably best to just have it always be the posted limit and let the driver override it. I wouldn't want it looking to see if there are people because that wouldn't tell you the law. You could end up speeding through a school zone.
I agree - however there will be people who will complain loudly about that solution too.
 
I often see school zone signs that say "Speed Limit 20 when children are present". That's a tough one to solve.
There's probably a broad set of traffic rules that exist mostly because humans can be bad at paying attention and slow to react, so Tesla might even consider FSD Beta's detection and even prediction of children to be superior, especially factoring in reaction speed. Of course, there's the wisdom of knowing when it's better to drive defensively.

I suppose there could be regulators deciding whether self-driving cars are safer than humans even when not following speed limits, whether that's on highways with single stack or on local roads, sometimes with extra conditions. Although it turns out there's a trick, at least in California, where autonomous vehicles can basically ignore speed limits and other traffic laws because… "city officials have determined that driverless vehicles cannot be cited for moving violations under current California law." 😆
 
I believe we are officially past Musk's latest "V11 to be available" tweet timeline. Since the last few have been based on "a week or two" phrasing and have overlapped, what kind of timeline will he offer this week? Come on, Elon, we are thirsty and need a BIG swig of that oh-so-delicious Twitter artificially flavored Kool-Aid. o_O
I think he will go radio silent for a bit and then come back with “in a few days” then a week or two later “next day or two” then another week later “tonight” lol
 
I can just imagine the V11 challenges facing the team. FSDb already has issues with prompt and correct decision-making at intersections, roundabouts, etc. Now imagine freeway onramps/offramps with fast-moving, merging traffic and an FSDb system that prefers to sip and savor roadway scenarios rather than gulp scenario data and make quick, optimal decisions. The team is probably in the V11.3 crutch-devising stage now.
 
I often see school zone signs that say "Speed Limit 20 when children are present". That's a tough one to solve.
Not for me. I know what a child looks like. 🤣 This is just one easy reason why non-monitored L4 is going to be a long time coming. Another example from this morning: at a kinda-blind (at least to the B-pillar) intersection, Beta pulled out quite far and then had to wait on a truck. It was an ATL Midtown courtesy vehicle, and they stopped and flashed their lights for me to go ahead and pull out. Of course Beta is obtuse to this gesture and sat there, so I had to disengage. Funny that some of the easiest human reactions are going to be the hardest for an L4 vehicle to master.
 
Not for me. I know what a child looks like.
Which one is an adult and which is a child? 🙂

[photo 1: a person in a hoodie on a road, seen from behind]


[photo 2]


Answer: the top pic is an adult (mid 20s) and the bottom pic is a teenager (mid teens). How the heck is the car supposed to tell the difference?

Easiest solution, just drive the school speed limit at all times. Unfortunately that's just going to annoy a different group of people than the ones complaining it doesn't recognize school zone speed limits.
 
I often see school zone signs that say "Speed Limit 20 when children are present". That's a tough one to solve.
All these posts regarding NRToR and school zones point out to me that States should maintain a public database of these special road restrictions. If not, the only other viable immediate choice is a crowdsourced reporting system maintained by Tesla. Google does this with Waze for road hazards and speed traps. It can't be that hard to do, especially if it was piloted in a limited geographic area.
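A crowdsourced system like the one suggested could start as simply as counting independent reports per road segment and only trusting a restriction once enough drivers have confirmed it. A minimal sketch (the schema, segment IDs, and confirmation threshold are all invented for illustration):

```python
from collections import defaultdict

CONFIRMATIONS_NEEDED = 3  # invented threshold before a restriction is trusted

class RestrictionDB:
    def __init__(self):
        # (segment_id, restriction) -> set of distinct reporter ids
        self.reports = defaultdict(set)

    def report(self, segment_id, restriction, reporter_id):
        self.reports[(segment_id, restriction)].add(reporter_id)

    def confirmed(self, segment_id):
        """Restrictions on this segment with enough independent reports."""
        return [r for (seg, r), who in self.reports.items()
                if seg == segment_id and len(who) >= CONFIRMATIONS_NEEDED]

db = RestrictionDB()
for car in ("car1", "car2", "car3"):
    db.report("elm_st_0412", "school_zone_20_when_children_present", car)
db.report("elm_st_0412", "no_right_on_red", "car1")  # only one report so far

print(db.confirmed("elm_st_0412"))
# ['school_zone_20_when_children_present']
```

Requiring distinct reporters is the same basic defense Waze uses against bad or stale reports; a real system would also need expiry and dispute handling.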
 
All these posts regarding NRToR and school zones point out to me that States should maintain a public database of these special road restrictions. If not, the only other viable immediate choice is a crowdsourced reporting system maintained by Tesla. Google does this with Waze for road hazards and speed traps. It can't be that hard to do, especially if it was piloted in a limited geographic area.
It's my main reason for pessimism regarding FSDbeta expanding to non-USA countries: how can the software adapt to/"learn" all sorts of different sets of road rules/signs/markings?

If these things have to be hand-coded into the software, the FSDbeta rollout will take immense time and effort (since you cannot automate it with NNs/deep learning).

If Tesla goes for an automated approach, will there be geofencing? I.e. will FSDbeta have separate subversions for each country/state? This would mean the software would have to know I'm crossing a border and switch to a new set of behaviours trained on different clips.

By following the current progress we know that is not what Tesla is currently going for. They are trying to train the models with clips from all over the USA and even the world. Some things are general ("don't hit anything; recognize all sorts of vehicles and road users"), but some things are very location-specific, and it is often in these instances that FSDbeta fails. (For example, the school zones and turn-on-red instances you guys were talking about.)

So yeah, I can see FSDbeta nailing the "not hitting anything when traversing the scene" part, but the specific road rules per country/state, I have my doubts in the near term.

But hey, the Tesla AI Team is smarter than me so they'll figure something out I hope.

(EDIT: a good example would be the rules regarding overtaking. In the US (at least in California, where I visited) you can overtake on either side. In Belgium you are only allowed to overtake a slower vehicle on the left-hand side, UNLESS in some very specific scenarios (for example, when the slower vehicle is making a left turn, etc.).
I have not heard anything on Autonomy Day/AI Day 1/2 about how they will tackle issues like this. The closest they got was explaining that road markings are not to be followed blindly but should act as a guideline, and that FSDbeta is looking for "driveable space". But then the planner has to perform actions that not only cause no harm to others but are also legal. Tough nut to crack IMO.)
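The overtaking example shows why per-jurisdiction rule tables may be unavoidable, whatever the perception stack learns. A naive sketch of what geofenced rules could look like (the table and situation names are invented and heavily simplified; real traffic codes have many more exceptions):

```python
# Illustrative only -- not a complete statement of any country's traffic law.
OVERTAKE_RULES = {
    "US-CA": {"right_side_allowed": True},
    "BE":    {"right_side_allowed": False,
              "right_side_exceptions": {"lead_vehicle_turning_left"}},
}

def may_overtake_on_right(jurisdiction, situation=None):
    rules = OVERTAKE_RULES[jurisdiction]
    if rules["right_side_allowed"]:
        return True
    # Otherwise only legal in the listed exceptional situations.
    return situation in rules.get("right_side_exceptions", set())

print(may_overtake_on_right("US-CA"))                            # True
print(may_overtake_on_right("BE"))                               # False
print(may_overtake_on_right("BE", "lead_vehicle_turning_left"))  # True
```

The hard part isn't the lookup, of course; it's that the planner has to recognize which "situation" it is in from vision alone, and a table like this has to be maintained for every jurisdiction.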
 
It's my main reason for pessimism regarding FSDbeta expanding to non-USA countries: how can the software adapt to/"learn" all sorts of different sets of road rules/signs/markings?

Just because FSD Beta isn't available to customers in countries outside of NA doesn't mean it doesn't exist. Especially in any country currently or previously covered by EU law, they're not allowed to ship it to customer vehicles. But late last year Teslascope posted some proof of V11 FSD Beta running on employee vehicles in the UK:

 
Just because FSD Beta isn't available to customers in countries outside of NA doesn't mean it doesn't exist. Especially in any country currently or previously covered by EU law, they're not allowed to ship it to customer vehicles. But late last year Teslascope posted some proof of V11 FSD Beta running on employee vehicles in the UK:

I've read about the testing in the UK (since it's not in the EU, the rules are less strict regarding autonomy software) and have since wondered how they're tackling the issue.

When Giga Berlin opened Elon first mentioned FSDbeta coming to EU after regulators would allow it, but he then said there were discussions with regulators ongoing. We've heard little about this since, besides a brief mention on AI Day #2.

I personally can't wait to try the software, even if it's cr*p at first, but I'm not getting my hopes up anytime soon.
 
It's my main reason for pessimism regarding FSDbeta expanding to non-USA countries: how can the software adapt to/"learn" all sorts of different sets of road rules/signs/markings?
I drive back and forth across the US-Canada border more than 10 times a year and my Tesla has no problem switching from legacy to modern signage and back again.
 
Just got my model y a week ago. I am on the build 2022.44.100 and subscribed to FSD. Requested FSD beta and still in queue. Any idea when the FSD Beta releases would come out for the build 2022.44.100?
I don't think anyone knows but your software release is 'ahead' of the current beta releases so you'd need to wait for the next beta release. May be a few weeks.

See this 'fleet' of 18,000-ish cars on TeslaFi.com as an example of how to watch the current FSD and non-FSD releases:
[screenshot: TeslaFi fleet firmware version tracker]
 
Just got my model y a week ago. I am on the build 2022.44.100 and subscribed to FSD. Requested FSD beta and still in queue. Any idea when the FSD Beta releases would come out for the build 2022.44.100?
There has been a near-perfect observation that "factory builds" that come on new cars and commonly end as yours does with the digits .n00 never directly go to FSDb builds. If this pattern still holds, you should expect to get a non-FSDb "regular" build within a few weeks, with a base higher than 2022.44 and subsequently to get an FSDb build with a higher base than either.

To give a somewhat recent real example: I took delivery on a model 3 which arrived with 2022.28.300 on November 3. I was offered 2022.40.4.1 on November 1, but did not install it, foolishly thinking I could hope to go straight to an FSDb version. Then on December 7 I installed 2022.40.4.2, which got me out of the factory version, but not yet into FSDb. Then I was offered and installed 2022.40.4.10 also known as FSDB 10.69.3.3 on December 17.

Your pace may vary, but I think that sort of sequence is what you can expect. You can perhaps hasten things a little by going to the software tab in the car once a day. That triggers a check, which usually just claims you are already up to date, but occasionally triggers a download somewhat sooner than it would have come on its own.
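The pattern described above (factory builds ending in .n00, and updates only ever moving to an equal-or-higher year.week base) can be sketched as a simple version comparison. The .n00 heuristic is just the observation from this thread, not anything Tesla documents:

```python
def parse(version):
    # "2022.44.100" -> (2022, 44, 100)
    return tuple(int(p) for p in version.split("."))

def is_factory_build(version):
    """Heuristic from this thread: factory builds end in .100, .200, etc."""
    last = parse(version)[-1]
    return last % 100 == 0 and last >= 100

def can_update_to(current, candidate):
    # Updates only ever move to an equal-or-higher base (year.week).
    return parse(candidate)[:2] >= parse(current)[:2]

print(is_factory_build("2022.44.100"))               # True
print(is_factory_build("2022.40.4.10"))              # False (an FSDb build)
print(can_update_to("2022.44.100", "2022.44.30.5"))  # True
print(can_update_to("2022.44.100", "2022.40.4.2"))   # False: lower base
```

This matches the sequence in the example above: 2022.28.300 (factory) could go to 2022.40.4.2 (regular) and then to 2022.40.4.10 (FSDb), but a car already on base 2022.44 has to wait for an FSDb build with a base of 2022.44 or later.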