The next big milestone for FSD is version 11. It is a significant upgrade, with fundamental changes to several parts of the FSD stack, including a totally new way to train the perception NN.

From AI Day and the Lex Fridman interview, we have a good sense of what might be included.

- Object permanence both temporal and spatial
- Moving from “bag of points” to objects in NN
- Creating a 3D vector representation of the environment all in NN
- Planner optimization using NN / Monte Carlo Tree Search (MCTS) (a generic sketch of the MCTS idea follows this list)
- Change from processed images to “photon count” / raw image
- Change from single image perception to surround video
- Merging of city, highway and parking lot stacks a.k.a. Single Stack
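
For anyone wondering what "planner optimization using NN / MCTS" means in rough terms, here is a minimal Python sketch of Monte Carlo Tree Search, not anything resembling Tesla's actual planner: the search repeatedly simulates short action sequences, scores the outcomes, and commits to the action whose subtree looks best. The toy 1-D "approach a stop line" state, the three-acceleration action set, and the hand-written value_estimate() heuristic are all assumptions made up for illustration; in a real NN+MCTS planner a learned network would supply the value (and/or policy) estimates.

```python
import math
import random

# Illustrative only: a toy 1-D "approach a stop line" problem, NOT Tesla's planner.
# A learned value network would replace value_estimate() in a real NN+MCTS planner.

ACTIONS = [-2.0, 0.0, 2.0]   # candidate accelerations in m/s^2 (assumed discretization)
DT = 0.5                     # planning timestep in seconds


def step(state, accel):
    """Advance (position, speed, stop_line) one timestep under a chosen acceleration."""
    pos, speed, stop_line = state
    speed = max(0.0, speed + accel * DT)
    return (pos + speed * DT, speed, stop_line)


def value_estimate(state):
    """Stand-in for a learned value net: reward stopping close to, but not past, the line."""
    pos, speed, stop_line = state
    overshoot = max(0.0, pos - stop_line)
    return -abs(stop_line - pos) - 5.0 * overshoot - 0.5 * speed


class Node:
    def __init__(self, state):
        self.state = state
        self.children = {}      # action -> Node
        self.visits = 0
        self.total_value = 0.0


def ucb(parent, child, c=1.4):
    """Upper-confidence bound used to balance exploration and exploitation."""
    if child.visits == 0:
        return float("inf")
    return (child.total_value / child.visits
            + c * math.sqrt(math.log(parent.visits) / child.visits))


def rollout(state, depth=5):
    """Cheap random playout, scored by the stand-in value function."""
    for _ in range(depth):
        state = step(state, random.choice(ACTIONS))
    return value_estimate(state)


def mcts(root_state, iterations=500):
    root = Node(root_state)
    for _ in range(iterations):
        node, path = root, [root]
        # 1. Selection: walk down fully expanded nodes via UCB.
        while len(node.children) == len(ACTIONS):
            parent = node
            node = max(node.children.values(), key=lambda ch: ucb(parent, ch))
            path.append(node)
        # 2. Expansion: try one untried action from this node.
        untried = [a for a in ACTIONS if a not in node.children]
        if untried:
            action = random.choice(untried)
            node.children[action] = Node(step(node.state, action))
            path.append(node.children[action])
        # 3. Evaluation and 4. backpropagation along the visited path.
        value = rollout(path[-1].state)
        for n in path:
            n.visits += 1
            n.total_value += value
    # Commit to the most-visited root action.
    return max(root.children.items(), key=lambda kv: kv[1].visits)[0]


if __name__ == "__main__":
    # Car at 0 m doing 10 m/s, stop line 30 m ahead: the search typically favors braking.
    print("chosen acceleration:", mcts((0.0, 10.0, 30.0)))
```

The appeal, as presented at AI Day, is the division of labor: the tree search supplies explicit lookahead, while learned estimates keep the search shallow enough to run in real time.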

Lex Fridman's interview of Elon, starting with the FSD-related topics.


Here is a detailed explanation of Beta 11 in layman's terms by James Douma, from an interview recorded after the Lex podcast.


Here is the AI Day explanation in 4 parts.




Here is a useful blog post posing a few questions to Tesla about AI Day. The most useful part is the comparison of Tesla's methods with those of Waymo and others (detailed papers linked).

 
On FSDb 11.4.7.3, is anyone else getting the annoying reroute to the fastest route after picking an alternate route? I have the car set to reroute only if it saves 30 minutes or more. I always take an alternate route that is only 5 minutes longer but has much less traffic. About a minute after I choose it and start driving, the car changes it back to its preferred fastest route. Help!
What I have found that seems to work is to touch the compass settings in the upper right of the screen immediately after you have chosen a route. This doesn't allow time for FSD to change the route back to the one it prefers.
 
Drove the same highway exit as the most recent Tesla video today, traffic about the same.
My experience was that the car didn't slow down enough: it came plowing off the highway and accelerated down the off-ramp toward a red light. It didn't have enough stopping distance to make either of the two right lanes and bailed.
I had to stop in the next lane over and signal; someone let me in immediately.
Definitely nothing like the sped-up video (where they glossed over the herky-jerky right-lane switch).
I'm going to keep trying that one as it is the same junction I take on most commutes.
Current score 30%
 
Why don't you just scroll wheel down the max speed to your liking when approaching that exit?
 
There's a video circulating on Reddit of an accident while on 11.4.7.3 where the vehicle exits the road (Reddit - Dive into anything). I'm trying to figure out if it's legit, but based on the experiences I'm having while making a new video on it, I would say it's at least possible. I've experienced some crazy regressions, but something like this is downright scary.
 
Is it the video from the front camera?
 

That Reddit crash is actually very similar to a mountain road that Electrek's Fred Lambert said almost killed him. We had the benefit of an interior view of Fred's experience, and it appeared that he had his foot hovering over the brake around the turns of the mountain road. When he accidentally tapped the brake lightly, FSD Beta disengaged and the car continued straight (toward a cliff) for a brief moment.

So in the video above, the driver either accidentally disengaged FSD Beta, or it suffered a catastrophic failure. Given that the nature of the crash was the car continuing forward at a moment when it was meant to be turning, my guess is the former.
 
And that matters in the subject why?
Evolution, intelligence, artificial intelligence, machine learning, biological learning, multi-sensor control with historical evolution to provide precognition versus the tunnel vision of "vision-only FSD": that is why we are still discussing dry wipes and parking in a garage after five years. Now you know.
 
What I have found that seems to work is to touch the compass settings in the upper right of the screen immediately after you have chosen a route. This doesn't allow time for FSD to change the route back to the one it prefers.
On mine, the alternate routes went away three updates ago; it hasn't offered one since. It now shows one and only one route and is locked on it for a while, then it randomly changes to some completely unrelated route that I've never seen before. This happened on the way home from a town 16 miles away: the route changed to one 1,688.2 miles away, and the satellite view jumped to somewhere else entirely, maybe NJ.

 
I can't tell. Seems like it. I always question the legitimacy of things like this, but I figured someone on here would run across it and either corroborate or disprove it.
My car's cameras cannot produce beautiful video like that. Can the HW4 cameras?
That video does not give full details.
My car has veered toward the right shoulder many times when entering a curved ramp onto the freeway. It seems the car's control software overcompensates too much. I am always ready to take over at that ramp.
 
We had the benefit of an interior view of Fred's experience, and it appeared that he had his foot hovering over the brake around the turns of the mountain road. When he accidentally tapped the brake lightly, FSD Beta disengaged and the car continued straight (toward a cliff) for a brief moment.
I don't have a video. I would have been killed by an 18-wheeler on an I-295 entrance ramp near Richmond: the car did not slow down, I reduced the speed with the scroll wheel, the truck came on very fast, and even as I disengaged the car kept going fast. I missed it by a hairline and lived to post about it. Still shaking.
 
Why don't you just scroll wheel down the max speed to your liking when approaching that exit?
Because any response that starts with "Why don't you just" completely misses the point and glosses over the driver needing to intervene.
Did you see the driver in the video slowing the car down with the scroll wheel or anything else? Did you see how short the ramp is and where the car has to go to get in line?
It's not a simple case of slowing the car down because it misread the speed: the car was accelerating down the ramp up to 60 mph, whereas the video shows it slowing down. The car completely misread the stopped traffic, and rather than wasting time with rapid wheel scrolling ("why don't I just..."), I took over and braked hard enough to correct the inappropriate, needless acceleration and pull into the line.
There is no "why don't you just" when there are impact targets in front of you and the car obviously doesn't have a clue.
 
I guess you missed my point. I read that this is your usual exit and that you know how FSD behaves there. Forget about whether there is an intervention or not; if you cannot find a better solution, then by all means always take over when you approach that exit. But assuming you want to keep using FSD rather than disengage it, you can always try to move over and dial the max speed down below 60 mph well ahead of the exit.
 
In that Reddit video, lane control looks a little loose after the first series of sunlight blasts and canopy shade, but the second blast of sunlight and shade probably killed the FSD camera data. Too much windshield glare. The steering input changed when the driver took over, albeit a bit too late. Not much time to react in those scenarios.
 
What? You know this how?
I have been trying for months to get out of the FSD beta test program, with no success. Others have reported the same, though I think one TMC poster was allowed out. We purchased FSD and volunteered for the beta test program back in the safety score days. (If memory serves me, our safety score was perfect even after we hit a deer, which took the car out of action for 2 months waiting for parts. We did eventually get into the testing program, though.)

So, owners of FSD are not allowed to leave the beta test program, but subscribers can drop the subscription which does take them out. My understanding is that re-subscribing does not mandate returning to the testing branch of the software.
 
So I don't understand your complaint and strongly suspect that the issue with FSD is your understanding of how it works. It needs a nav path to operate correctly; you don't simply use the stalks like joysticks to push it where you want to go. No path, no play.
That is what it does, but it does take blinker input to make lane changes. When navigation is inactive, it could also turn at intersections in response to a manual blinker activation. I think that might be a nice feature.
 
That would be nice if it could do that. Currently it just goes in a straight line or follows the path, stops at red lights and stop signs, and changes lanes on blinker input, but it won't turn. I have used this plenty of times on very short trips when I don't want Tesla Insurance to ding me for hard braking at red lights (v2 eliminated that). The downside is that sometimes I forget I am on self-driving and it goes straight instead of making a turn.
 