Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
The next big milestone for FSD is 11. It is a significant upgrade, with fundamental changes to several parts of the FSD stack, including a totally new way to train the perception NN.

From AI day and Lex Fridman interview we have a good sense of what might be included.

- Object permanence both temporal and spatial
- Moving from “bag of points” to objects in NN
- Creating a 3D vector representation of the environment all in NN
- Planner optimization using NN / Monte Carlo Tree Search (MCTS)
- Change from processed images to “photon count” / raw image
- Change from single image perception to surround video
- Merging of city, highway and parking lot stacks a.k.a. Single Stack
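
The planner item above mentions MCTS. As a rough sketch only (this is not Tesla's planner; the 1-D road world, actions, obstacle, and reward below are invented purely for illustration), here is what a minimal UCT-style Monte Carlo Tree Search over driving actions looks like:

```python
# Toy UCT Monte Carlo Tree Search -- the *kind* of search the planner item
# refers to, NOT Tesla's actual planner. Hypothetical world: state is
# (position, lane); actions move forward or change lanes; one cell is blocked.
import math
import random

ACTIONS = ["forward", "left", "right"]
BLOCKED = {(2, 0)}          # assumed obstacle: position 2, lane 0
GOAL_POS, LANES = 4, 2

def step(state, action):
    pos, lane = state
    if action == "forward":
        pos += 1
    elif action == "left":
        lane = max(0, lane - 1)
    else:
        lane = min(LANES - 1, lane + 1)
    return (pos, lane)

def rollout(state, depth=8):
    """Random playout; reward 1 for reaching the goal without a collision."""
    for _ in range(depth):
        if state in BLOCKED:
            return 0.0
        if state[0] >= GOAL_POS:
            return 1.0
        state = step(state, random.choice(ACTIONS))
    return 0.0

class Node:
    def __init__(self, state):
        self.state = state
        self.children = {}      # action -> Node
        self.visits = 0
        self.value = 0.0

def uct_select(node):
    # Pick the child maximizing mean value + exploration bonus (UCB1).
    return max(node.children.items(),
               key=lambda kv: kv[1].value / (kv[1].visits + 1e-9)
               + math.sqrt(2 * math.log(node.visits + 1) / (kv[1].visits + 1e-9)))

def mcts(root_state, iters=2000):
    root = Node(root_state)
    for _ in range(iters):
        node, path = root, [root]
        # Selection: walk down fully expanded nodes.
        while len(node.children) == len(ACTIONS):
            _, node = uct_select(node)
            path.append(node)
        # Expansion: add one untried action, unless the state is terminal.
        untried = [a for a in ACTIONS if a not in node.children]
        if untried and node.state not in BLOCKED and node.state[0] < GOAL_POS:
            a = random.choice(untried)
            child = Node(step(node.state, a))
            node.children[a] = child
            path.append(child)
        # Simulation + backpropagation.
        reward = rollout(path[-1].state)
        for n in path:
            n.visits += 1
            n.value += reward
    # Standard UCT decision rule: most-visited first action.
    return max(root.children.items(), key=lambda kv: kv[1].visits)[0]
```

Calling `mcts((0, 0))` returns the first action whose subtree was visited most. A real planner would presumably score comfort and collision risk using NN outputs instead of a binary rollout reward, which is where the "NN / MCTS" combination in the list comes in.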

Lex Fridman's interview of Elon, starting with the FSD-related topics.


Here is a detailed explanation of Beta 11 in "layman's language" by James Douma, in an interview recorded after the Lex podcast.


Here is the AI Day explanation, in 4 parts.




Here is a useful blog post asking Tesla a few questions about AI Day. The useful part is the comparison of Tesla's methods with Waymo's and others' (detailed papers linked).

 
I've found a "bug", or at least a consistent issue, with 11.4.2 today. It doesn't deal well with people turning left while I'm waiting for an unprotected right. There were a bunch of cars turning left in front of me, and 11.4.2 just decided it was impatient and wanted to squeeze in, lol. I found this impatient behavior somewhat consistent on other unprotected rights.
I noticed one change in 11.x that wasn't there in 10.69 - that is nice.

While waiting for an unprotected right turn onto a two-lane road, if a car in the left lane takes a left turn, FSD knows it's safe to take the right turn.

- FSD is waiting for an unprotected right turn at a T junction.
- There is a car in the left lane that wants to turn left. This car is blocking FSD's view of the road from the left, so FSD can't take its right turn.
- The car on the left starts making its left turn.
- FSD figures out that if the car on the left is making its turn, it's probably safe to take the right turn, and turns ...
 
Do you have any examples of a state that has special requirements for deaf people to drive a car? In Texas, there are none, and no hearing tests are given in order to get a license. Some states may require things like additional mirrors to provide a wider field of view for seeing emergency vehicles. Teslas obviously have that covered with the multiple cameras.
I get what you are saying, but where is the benefit of not supporting audio? It may be possible to achieve some degree of safety without audio support, but why make the march of 9s more difficult? Just something to think about. I will be surprised if audio is not supported by HW 5.
 
I get what you are saying, but where is the benefit of not supporting audio? It may be possible to achieve some degree of safety without audio support, but why make the march of 9s more difficult? Just something to think about. I will be surprised if audio is not supported by HW 5.
I'm not saying that there is no benefit, only that interpreting audio is not an obvious requirement for self driving. If you are driving at high speed, it is likely that you will not hear an emergency vehicle coming up from behind. And, plenty of people crank the music volume up to the point where hearing emergency vehicles is unlikely, yet still, arguably, can drive.
 
it would be a fairly large download for a geometrical road / parking map of the entire continent
But is it really that much more than just roads to include as part of the multi-GB map updates like NA-2022.44-14515? Here's OpenStreetMap Tesla Gigafactory Texas area defined with 18 nodes/coordinates. Compared to one of the service roads in one direction along one side of the building defined with 97 nodes. (Vs one of its larger parking lots with 26 nodes and the building itself with 9 nodes. [It's actually defined as an octagon, so these areas need a repeated last node to enclose. ;)])
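
That repeated-last-node aside can be made concrete with a tiny sketch: OSM-style closed ways mark an area by repeating the first node at the end, and an area formula can then treat the coordinate list as a closed ring. (The coordinates below are made up for illustration, not the actual Giga Texas geometry.)

```python
# Illustration of the "repeated last node" convention: OpenStreetMap closed
# ways (buildings, parking lots) repeat the first node at the end so the
# way encloses an area rather than being an open line.
def ring_area(nodes):
    """Shoelace formula; expects a closed ring (first node == last node)."""
    if nodes[0] != nodes[-1]:
        raise ValueError("ring is not closed: repeat the first node at the end")
    s = 0.0
    for (x1, y1), (x2, y2) in zip(nodes, nodes[1:]):
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

# A unit square as a 4-node way needs 5 coordinates to enclose:
square = [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]
print(ring_area(square))  # 1.0
```

So the octagonal building's 9 stored nodes describe only 8 distinct corners; the ninth just closes the ring.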

I think that the dense autolabeling has seen so many roads / intersections / parking lots, that it's able to "hallucinate" road geometry beyond its vision
In this particular case, I think that could have been Vision: I turned into the alley with visibility of the main street parallel to the alley, so while driving down the alley and seeing a cross street, it could use the general "knowledge" from many intersections that streets meet at 90º angles to predict that the cross street should meet up with the main street. I wouldn't be surprised if the neural networks can infer the shape of the road just from the placement of trees, street lights, fences, houses, etc.; similarly, they can probably infer a lot from less than a second of good visibility.

Karpathy showed this type of behavior back at AI Day 2021. Watch how the side street predictions update with better visibility but also taking into account what might have been seen a while ago.
 
But is it really that much more than just roads to include as part of the multi-GB map updates like NA-2022.44-14515? Here's OpenStreetMap Tesla Gigafactory Texas area defined with 18 nodes/coordinates. Compared to one of the service roads in one direction along one side of the building defined with 97 nodes. (Vs one of its larger parking lots with 26 nodes and the building itself with 9 nodes. [It's actually defined as an octagon, so these areas need a repeated last node to enclose. ;)])

Yeah, I think a full vector map accurate to ±1 foot would be a large download: every node would need global coordinates plus relationships to its neighbors. The map would be in the gigabytes, I think.

Also, it wouldn't fit with Tesla's philosophy with fsd :)
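
As a back-of-envelope sanity check on that "gigabytes" guess (the node count and per-field byte sizes below are numbers I made up for illustration, not figures from Tesla or OpenStreetMap):

```python
# Rough storage estimate for a continent-scale vector map of roads and
# parking lots. All inputs are assumptions chosen only to show the scale.
nodes = 500_000_000        # assumed node count for North America-scale geometry
coord_bytes = 2 * 8        # lat/lon stored as two 64-bit floats
link_bytes = 2 * 4         # roughly two 32-bit references to neighboring nodes
total_gb = nodes * (coord_bytes + link_bytes) / 1e9
print(f"{total_gb:.0f} GB")  # 12 GB at these assumptions
```

Even before indexes, lane attributes, and versioning overhead, the uncompressed figure lands comfortably in the gigabytes, which is consistent with the guess above.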

I was noticing the visualizations today, and even for far away predictions, there was quite a bit of uncertainty and fuzziness when the car is in motion.

Despite this, the visualizations / predictions are so good that many people think it's an HD map.
 
Looks from TeslaFi like several people with 2023.12.10 (11.3.6) have been updated to 2023.12.11!

So, anybody know if this is a new 11.4.2 build, or maybe 11.4.3? It's even possible it's 11.3.7, which would be rather disappointing. It's almost certain to be another FSD Beta version.
 
I subbed to try FSD 11.3.6 over the past few weeks in my area with one 300 mile road trip. It’s really neat beta software that does make road trip highway driving nicer.

For my daily drive, it's currently just too expensive and doesn't add enough value. Too many interventions in my local town: poor lane selection, and bad lane changes just before the navigation route needs them (it will change left, then right). The steering is way too jerky at times. The road by my house has speed limit signs posted at 45, yet the car decides to default down to 25 instead? Weird. I'm noticing the same speed limit issues with FSD as I was with standard Autopilot. Why are so many highways in my area not mapped with the correct speed limit? It's ridiculous, imo. The speed limits on these roads haven't changed in ages.

I've had several very bad safety-related interventions where my car tried to pull out in front of someone, nearly causing a t-bone / rear-end event. I have to slam on the brakes at the last minute, because I give the car the benefit of the doubt and assume surely it isn't about to pull out in front of the other car. On my commute on a narrow two-lane road, I always get phantom braking when cars are coming in the opposite lane.

There are times when a route FSD generally does pretty well on will downright suck, leaving me not wanting to use it.

The same route can give me many different reasons for disengagement, which amounts to what I consider an intermittent defect and nullifies overall product enjoyment and usage for me.

I figured I'd give it a few more months so I can see the software progression for myself and judge what I call the quality-of-life improvement. If my disengagements and constant speed-limit bug reports aren't addressed, then I'll know general autonomy is still way more than a year out. When it works and I get those no-intervention routes, it's amazing. Unfortunately, those are more the exception in my area.
 
Ran over 500 miles of FSD this weekend on many different routes. This version is more polished, but I finally found my step-back issue: the car is back to trying to turn one block too soon, jumping into right-turn lanes and then trying to come back onto the main road. It really got frustrating by last night. I haven't had this issue in a long time; it typically drives to and from work without issue.

It makes you wonder sometimes if the car behaves at first then falls back into old bad habits.
 
The fact that people feel the need for these signs is troubling.

If your testing of FSD is affecting the flow of traffic, disengage and drive manually.

Do not force others to deal with Tesla's erratic FSD behavior.

Do you also want Student Drivers to remove their car signs?
 
The problem with these magnets/signs is the small print, which can't be read until you're right behind the vehicle and/or tailgating.
Who cares about the Tesla logo? Frankly, just "Student Driver" or similar wording in a larger font would be better noticed and more effective.
 
In my experience, it's not the letter of the law that causes the most issues for me that require me to disengage.

It's the over-cautiousness at intersections (just how far is it going to creep into a 4-way stop before it decides it's safe to go?), erratic behavior making turns (rapid back-and-forth steering adjustments), turning on the blinker when not needed, changing lanes back and forth when not needed, etc.

FSD drives like it's a high, drunk, teenage, elderly driver, all combined in one.

And every now and then it nails a situation perfectly and you wonder "now why can't it drive like that all the time?".
Sounds like you might have a Model S/X lol
 
It does this on my M3, especially the unnecessary signaling just going straight or around a bend. It will consistently blow by at least two different stop signs, and will try to be in right-hand lanes that are mostly parking except during rush hour.
But if you talk to some members on here they’ll swear FSDj is godlike, worth the price, and changed their lives.
 
I noticed one change in 11.x that wasn't there in 10.69 - that is nice.

While waiting for an unprotected right turn onto a two-lane road, if a car in the left lane takes a left turn, FSD knows it's safe to take the right turn.

- FSD is waiting for an unprotected right turn at a T junction.
- There is a car in the left lane that wants to turn left. This car is blocking FSD's view of the road from the left, so FSD can't take its right turn.
- The car on the left starts making its left turn.
- FSD figures out that if the car on the left is making its turn, it's probably safe to take the right turn, and turns ...
This sounds unsafe to me. The car making the left turn will clear oncoming traffic from the left quickly as it just has to cross the lane. Your car making a right turn will need to complete the turn and then accelerate to the speed of traffic, which takes much longer.

I never assume there is a big enough space in traffic for me to make a right turn when I cannot see oncoming traffic and judge for myself. I disengage if FSD attempts this.

GSP
 
11.4.2 report: An attempt at running a red light yesterday.

So, there's this one intersection where, when commuting to and from work, FSD-b has had a periodic habit, when it's first in line at the red light, of making an attempt at running the red light. It didn't do that every time with 10.69, but three times in four months or so, and only there, is enough to make that behavior stick in anybody's mind.

So, when commuting, that particular intersection was crossed going from south to north. Fine.

Yesterday I was making a pizza run, starting from an out-of-the-ordinary spot (a voting location), and, as it happened, was going from west to east on the same road I cross during my commute, where the fun occurred. Red light; the car came to a halt and, as it happened, was first in line.

And the blamed FSD-b started to run the red light. Hit the brakes, reported the incident. As noted, this time I was crossing the intersection west to east.

This is just weird. Now that I think of it, this might be the only intersection where FSD-b has tried to run a light, and it's done so across different versions and in two different directions.

Some kind of bizarre map error? There's nothing particularly weird about this intersection. The roads are at right angles. It's very nearly flat, with a slight rise from south to north. There are no buildings particularly blocking the view. But FSD-b tries to commit suicide there. There are four lights hung around the intersection, with only one visible in each direction. But, still.