The next big milestone for FSD is 11. It is a significant upgrade, with fundamental changes to several parts of the FSD stack, including a totally new way to train the perception NN.

From AI Day and the Lex Fridman interview we have a good sense of what might be included.

- Object permanence both temporal and spatial
- Moving from “bag of points” to objects in NN
- Creating a 3D vector representation of the environment all in NN
- Planner optimization using NN / Monte Carlo Tree Search (MCTS) (see the sketch after this list)
- Change from processed images to “photon count” / raw image
- Change from single image perception to surround video
- Merging of city, highway and parking lot stacks a.k.a. Single Stack
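
For a flavor of what an MCTS planner does, here is a minimal, generic sketch in Python. To be clear, Tesla has not published its planner code: the toy action set, transition model, and reward below are hypothetical placeholders, and only the select / expand / rollout / backpropagate loop is the standard algorithm.

```python
# Minimal, generic MCTS sketch. The action set, transition model, and
# reward are toy placeholders, not Tesla's actual planner.
import math
import random

ACTIONS = ["keep_lane", "slow", "nudge_left", "nudge_right"]  # hypothetical

class Node:
    def __init__(self, state, parent=None):
        self.state = state        # opaque planner state (toy: tuple of actions)
        self.parent = parent
        self.children = {}        # action -> child Node
        self.visits = 0
        self.value = 0.0          # running mean of rollout returns

def ucb1(parent, child, c=1.4):
    # Upper Confidence Bound: trade off exploitation vs. exploration.
    return child.value + c * math.sqrt(math.log(parent.visits) / child.visits)

def step(state, action):
    # Placeholder transition; a real planner would roll a world model
    # (ego dynamics plus predicted agents) forward here.
    return state + (action,)

def rollout_reward(state):
    # Placeholder score; a real planner would rate collision risk,
    # comfort, progress, legality, etc.
    return random.random()

def mcts(root_state, iterations=1000, horizon=5):
    root = Node(root_state)
    for _ in range(iterations):
        node = root
        # 1. Selection: walk down fully expanded nodes via UCB1.
        while len(node.children) == len(ACTIONS):
            node = max(node.children.values(), key=lambda ch: ucb1(node, ch))
        # 2. Expansion: add one untried action.
        action = random.choice([a for a in ACTIONS if a not in node.children])
        node.children[action] = Node(step(node.state, action), parent=node)
        node = node.children[action]
        # 3. Rollout: random playout to a fixed horizon.
        state = node.state
        for _ in range(horizon):
            state = step(state, random.choice(ACTIONS))
        reward = rollout_reward(state)
        # 4. Backpropagation: update running means up to the root.
        while node is not None:
            node.visits += 1
            node.value += (reward - node.value) / node.visits
            node = node.parent
    # Commit to the most-visited first action.
    return max(root.children, key=lambda a: root.children[a].visits)

print(mcts(root_state=()))   # e.g. "keep_lane"
```

In a real planner the rollout and reward would come from a learned world model and safety/comfort scoring; the point of MCTS is just to concentrate simulation effort on the most promising maneuver sequences.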

Lex Fridman's interview of Elon, starting with the FSD-related topics.


Here is a detailed explanation of Beta 11 in "layman's language" by James Douma, an interview done after the Lex podcast.


Here is the AI Day explanation, in 4 parts.




Here is a useful blog post asking Tesla a few questions about AI Day. The useful part is the comparison of Tesla's methods with Waymo's and others' (detailed papers linked).

 
In addition to this, I had 11.4.4 slow and stop for a raccoon on my way home from our neighbor's house this evening. This is the stuff that makes me not regret paying for FSD.
Great news! I have a long history of disengaging for chickens/deer/squirrels/etc. (currently on 11.4.7.3). FSDb killed a squirrel in 2022. Recent disconnects for squirrels and deer on 11.4.7.3.

Animals are unpredictable - and come in multiples (deer). My experience is that when you see an animal, you should never assume it will cross the road and be on its way (a groundhog years ago, after crossing the road, made a U-turn and ran in front of my car). Suicidal, some of them. FSDb needs to recognize that they are unpredictable and take this into account.
 
I have never said to move cameras. What is needed is 2 additional cameras closer to the front. Of course, that would require additional processing power to handle the cameras' output, which may be contributing to Tesla's decision not to add them.
Perhaps one thing we can all agree on - the current hardware is inadequate for "real" FSD.
 
Agree.
For those who disagree, I suspect it's related to where they live and the road layout. I drove FSD for a week in Texas recently. What a difference compared to what I'm used to in greater Boston.
 
I seem to be massively outvoted on this, so likely I'm wrong? But a software update in July or August reduced the camera vision on mine by > 50% and introduced lots of new problems that appeared to be vision related, e.g., failing to see red lights until 10 feet from them, failing to see oncoming semis until 100 feet from them, phantom vehicles flickering in and out of existence 50 feet off the highway, even well past the shoulder, etc. It felt like a decision had been made to reduce the number of photons coming in, trading the accuracy of the ultimate decision for speed?

So yeah, if the problems are a result of the hardware being too slow, unable to resolve the images fast enough to figure out what it is seeing in time to take the correct action, then hardware is an issue. But it was so much better prior to the update that I believe the jury is still out on camera location and performance. I don't think it's fair to condemn the hardware when the software is so clearly error-ridden and so poorly understood / designed that they are unable to change the volume indicator from horizontal to vertical without screwing up the windshield wipers - and after 3 more updates, the wipers are still screwed up.

And I'm in Texas and have only been in Oklahoma, Kansas, Nebraska, Arkansas, Missouri and Iowa with mine. I'm sure that while downtown Houston is a horror, it is quite likely very different from downtown Los Angeles or New York.
 
Perhaps the misalignment of views is definitional - when I say "real FSD," I mean the stuff Level 4/5 is made of, not a quasi-usable (on occasion) Level 2 system.
 
Or the view shown to the driver is not what the computer can actually see. In the driving visualization, I know the car can see much further than the black horizon shown. I recently noticed this driving at night: the visualization out to the black horizon was minimal, yet at the top of the screen the car was picking up a green light at an intersection easily a quarter mile away, which I could barely see myself squinting off into the distance. I was rather impressed with how far it could see.
There is also the question of the car screen's resolution and grayscale performance, neither of which is particularly good for video. You would need to be able to zoom in on each camera's view to judge what the cameras are and are not capable of resolving. (Of course, to the OP's point, if the camera's view is blocked then no amount of resolution or processing will help!)
 
Another example of the B-pillar camera. Today was the first time I've driven this road. The front of the car is already into the edge of the crossing road and, since there is no shoulder, it cannot creep any further. By leaning forward I was able to see safely, but I think it's clear the B-pillar camera cannot see far enough for FSD to make a safe decision, so I disengaged.

B-Pillar Westford.jpg
 
There are cases where the view is still obstructed regardless of the camera's position (that doesn't mean I don't want Tesla to add 3 more cameras to the headlights/front bumper and 2 more in the rear bumper corners - don't shoot me). When making a right turn on a street that has 2 or 3 right-turn lanes, the cars in the 2 lanes to my left always obstruct my view, regardless of what kind of car I drive. This happens daily for me.
 
Perhaps one thing we can all agree on - the current hardware is inadequate for "real" FSD.
I do not agree. I do understand that this is your opinion, and you have given decent reasons. I, however, remain hopeful that the hardware I already bought can eventually be programmed to drive well enough for me to trust it on city streets as much as I do now on highways. Is that "real" FSD?

So, one issue is what you mean by "real" FSD. Perhaps you mean fully autonomous driving. Or L4 or L5 (or better yet L6, which might include driving on Mars. Or to Mars.) One could argue that FSD is already literally capable of full self driving: it steers, turns, brakes, accelerates, navigates, avoids pedestrians (and deer, sometimes, we hear). Not always well, we all know, but it does it. Millions of miles of it, we've been told. But clearly you mean something different. Suggestions that FSD would soon be able to drive from coast to coast without intervention were obviously fanciful, especially given the absence of those robot snake-arm cable-equipped Supercharging stations, so this is probably not what you mean either. Heck, FSD won't even find a spot to park yet.

Another issue is that proving the hardware is inadequate would be difficult. Proving it is adequate is hard enough, but even if someone does eventually demonstrate a system that does what you want by using more elaborate hardware, that will not prove that it is impossible with the current hardware.

We use many tricks to understand the environment around us besides the static vision gaps described here as limiting FSD. For example, we notice motion in gaps through which we can see. We use differential motion in our visual field to tell what direction we are moving. We observe how other cars are behaving. We look at the wheels of a stopped car to tell if it is starting to move. Obstructed vision at intersections is tough, and it does cause plenty of accidents. But pedestrians are already much safer around FSD, because we humans have to look left when turning right, while FSD looks in both directions at once.
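
As a toy illustration of that "differential motion" trick: the sketch below (assuming OpenCV and NumPy, and near-pure forward translation; a generic computer-vision demo, not Tesla's actual pipeline) estimates where the car is heading from two consecutive frames by finding the focus of expansion, the image point all flow vectors radiate away from when moving forward.

```python
# Toy focus-of-expansion estimate from dense optical flow (OpenCV + NumPy).
import cv2
import numpy as np

def focus_of_expansion(prev_gray, next_gray, min_mag=0.5):
    # Dense per-pixel flow between two consecutive grayscale frames.
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, next_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    h, w = flow.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    dx = flow[..., 0].ravel()
    dy = flow[..., 1].ravel()
    xs = xs.ravel().astype(float)
    ys = ys.ravel().astype(float)
    # Keep only confident (large enough) flow vectors.
    keep = np.hypot(dx, dy) > min_mag
    dx, dy, xs, ys = dx[keep], dy[keep], xs[keep], ys[keep]
    # Under forward translation, (pixel - foe) is parallel to the flow,
    # giving one linear equation per pixel:
    #   dy * foe_x - dx * foe_y = dy * x - dx * y
    A = np.stack([dy, -dx], axis=1)
    b = dy * xs - dx * ys
    foe, *_ = np.linalg.lstsq(A, b, rcond=None)
    return foe  # (x, y) pixel where the heading direction pierces the image

# Usage (hypothetical file names):
#   f0 = cv2.cvtColor(cv2.imread("frame0.png"), cv2.COLOR_BGR2GRAY)
#   f1 = cv2.cvtColor(cv2.imread("frame1.png"), cv2.COLOR_BGR2GRAY)
#   print(focus_of_expansion(f0, f1))
```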

In any case, it is hard to agree with a vague prediction of an untestable assertion. I agree that your speculation is not unreasonable, but I don't agree that it is true. Also, you state it as a fact to agree on, but it is only an opinion, and a speculation at that, with which I disagree.

If you asked if I think the hardware might be inadequate, I would say I do think that it might be, but I hope not. I will continue to try and help Tesla make it better and see how far we get. But don't let the gap in FSD's side vision block your view of the big picture. ;-)
 
No one is suggesting moving cameras. One proposal would be to ADD cameras, to be able to mimic human capability in certain scenarios.

Obviously a human has much better capability than simple 360 vision (though they have to work pretty quickly to construct the 360 view, they can do that and more!).
Actually I’ve seen several people suggesting moving the cameras. Personally I don’t think we have the data to make a definitive argument either way, though.

To your point, humans have more ability to deduce and extrapolate, giving them more ability to compensate for visual limitations.
 
I do not agree. I do understand that this is your opinion, and you have given decent reasons. I, however, remain hopeful that the hardware I already bought can eventually be programmed to drive well enough for me to trust it on city streets as much as I do now on highways. Is that "real" FSD?
tl;dr
 
Another example of the B-pillar camera. Today was the first time I've driven this road. The front of the car is already into the edge of the crossing road and, since there is no shoulder, it cannot creep any further. By leaning forward I was able to see safely, but I think it's clear the B-pillar camera cannot see far enough for FSD to make a safe decision, so I disengaged.


What does the wide angle show in this situation?
 
Canceled? FSD works fine for me in all sorts of rain. 11.4.7.3.
On cold, damp days, it's not unusual for FSD to have issues with fogging on the inside of the glass of the camera housing. I've recently had FSD disable itself until the area heats up enough to clear the glass.

And the current version has a wiper issue that causes them not to wipe often enough in light rain unless you manually set the speed. This can lead FSD to bail out with the red hands of death. I never had wiper issues until this version.

11.4.4 worked great for me in the rain.
 
No one is suggesting moving cameras. One proposal would be to ADD cameras, to be able to mimic human capability in certain scenarios.

Obviously a human has much better capability than simple 360 vision (though they have to work pretty quickly to construct the 360 view, they can do that and more!).
Tesla is obviously missing a camera on the front of the car; without it there's no hope of self-parking, a 360 view, or adequate visibility at intersections. No amount of software updates can compensate for that.
 
What does the wide angle show in this situation?
I didn't save the wide-angle view, but it would not have helped in this situation.
Here is a wide-angle view taken in my driveway, which shows how limited the view really is. Frankly, I was surprised how little the wide angle helps at crossing intersections. This is why B-pillar obstructions are so problematic.

FSD Wide Angle View.jpg
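
Some back-of-the-envelope geometry shows why the creep distance matters so much in these B-pillar situations. All the numbers below are illustrative assumptions (camera setback, obstruction offset, lane position), not Tesla measurements:

```python
# Sight-line geometry for an obstructed intersection (all numbers assumed).
SETBACK = 1.8   # m: B-pillar camera behind the front bumper (assumption)
OFFSET  = 2.0   # m: lateral distance from camera to the obstruction corner
LANE    = 2.5   # m: road edge to the middle of the near cross lane

def visible_range(gap):
    """How far down the near cross lane the camera can see, where `gap` is
    the forward distance (m) from the camera to the obstruction plane.
    Grazing ray through the corner: y/x = OFFSET/gap, evaluated at the lane."""
    return OFFSET * (gap + LANE) / gap

for creep in (0.0, 1.0, 1.7):   # metres the nose pokes into the crossing road
    print(f"creep {creep:.1f} m -> sees {visible_range(SETBACK - creep):5.1f} m")
```

With these assumed numbers the camera sees less than 5 m down the cross lane when the bumper reaches the road edge, and gets to roughly 50 m only once the nose pokes about 1.7 m into the crossing road, which is exactly the situation in the photo above.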