The next big milestone for FSD is v11. It is a significant upgrade, with fundamental changes to several parts of the FSD stack, including a totally new way to train the perception NN.

From AI Day and the Lex Fridman interview, we have a good sense of what might be included.

- Object permanence both temporal and spatial
- Moving from “bag of points” to objects in NN
- Creating a 3D vector representation of the environment all in NN
- Planner optimization using NN / Monte Carlo Tree Search (MCTS); a generic sketch is shown below the list
- Change from processed images to “photon count” / raw image
- Change from single image perception to surround video
- Merging of city, highway and parking lot stacks a.k.a. Single Stack
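
For anyone curious about the MCTS item on that list, here is a minimal, generic Monte Carlo Tree Search sketch in Python. To be clear, the toy LaneState model, the action set, and the reward function are invented purely for illustration; per AI Day, Tesla's planner reportedly uses neural networks to guide the tree search and evaluate candidate plans, which this bare-bones version replaces with random rollouts.

```python
# Minimal, generic Monte Carlo Tree Search (MCTS) sketch.
# The LaneState model, actions, and reward are invented for illustration only.
import math
import random

ACTIONS = ["keep_speed", "accelerate", "brake", "nudge_left", "nudge_right"]

class LaneState:
    """Toy state: gap to the lead car (m) and ego speed (m/s)."""
    def __init__(self, gap=30.0, speed=15.0):
        self.gap, self.speed = gap, speed

    def step(self, action):
        speed = max(0.0, self.speed + {"accelerate": 1.0, "brake": -2.0}.get(action, 0.0))
        gap = self.gap + (13.0 - speed) * 1.0   # lead car assumed steady at 13 m/s, 1 s step
        return LaneState(gap, speed)

    def reward(self):
        if self.gap < 2.0:
            return -100.0                        # tailgating / collision risk
        return self.speed - 0.05 * abs(self.gap - 30.0)   # progress vs. keeping a sane gap

class Node:
    def __init__(self, state, parent=None):
        self.state, self.parent = state, parent
        self.children = {}                       # action -> child Node
        self.visits, self.value = 0, 0.0

    def best_child(self, c=1.4):
        # UCB1: trade off a child's average value against how rarely it was tried.
        return max(self.children.values(),
                   key=lambda n: n.value / (n.visits + 1e-9)
                   + c * math.sqrt(math.log(self.visits + 1) / (n.visits + 1e-9)))

def rollout(state, depth=5):
    total = 0.0
    for _ in range(depth):                       # random playout; a NN could score this instead
        state = state.step(random.choice(ACTIONS))
        total += state.reward()
    return total

def mcts(root_state, iterations=2000):
    root = Node(root_state)
    for _ in range(iterations):
        node = root
        while len(node.children) == len(ACTIONS):             # 1. selection
            node = node.best_child()
        action = random.choice([a for a in ACTIONS if a not in node.children])
        child = Node(node.state.step(action), parent=node)    # 2. expansion
        node.children[action] = child
        value = rollout(child.state)                          # 3. simulation
        while child is not None:                              # 4. backpropagation
            child.visits += 1
            child.value += value
            child = child.parent
    return max(root.children, key=lambda a: root.children[a].visits)

print("chosen action:", mcts(LaneState(gap=12.0, speed=18.0)))
```

Even in this toy form you can see the appeal for planning: instead of scoring each next action in isolation, the planner simulates many short futures and keeps the action whose subtree consistently works out.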

Lex Fridman interview of Elon, starting with FSD-related topics.


Here is a detailed explanation of Beta 11 in layman's terms by James Douma, from an interview done after the Lex podcast.


Here is the AI Day explanation, in 4 parts.




Here is a useful blog post posing a few questions to Tesla about AI Day. The useful part is the comparison of Tesla's methods with Waymo's and others' (detailed papers linked).

 
My preference for HW5 is adding two cameras with a 90-degree wide-angle view to the existing windshield area, for the following reasons:
  1. They are protected from rain/snow since the wipers/defroster clear the windshield.
  2. The cameras are high, so they will see over vehicle hoods and snow banks.
  3. They are far enough forward to be a real improvement over the human driver's view, which is limited by the mirrors and A-pillar.
I'm not sure, though, whether they could be added in a way that avoids sun glare. The key point is that every time a human has to lean forward to monitor FSD decisions at an intersection because of some obstruction, FSD is not seeing as well as the human.
The reason I don't really like that top-center windshield location is that it's nearly useless for a very common use case: seeing the high-speed oncoming lanes, past the obstruction of a left turning car in the oncoming turn lane or turn bay.

This is already somewhat of a challenge for a human who can lean towards the driver side window. The center windshield location, even though it's high up, is markedly inferior. The front left corner location is the best for this, the A-pillar or driver side mirror location next best, and a forward-facing repeater camera (that can at least see along the fender) is the third best.
 
The reason I don't really like that top-center windshield location is that it's nearly useless for a very common use case: seeing the high-speed oncoming lanes, past the obstruction of a left turning car in the oncoming turn lane or turn bay.

This is already somewhat of a challenge for a human who can lean towards the driver side window. The center windshield location, even though it's high up, is markedly inferior. The front left corner location is the best for this, the A-pillar or driver side mirror location next best, and a forward-facing repeater camera (that can at least see along the fender) is the third best.
Anything would be an improvement. The front left corner location cannot see over hoods or snow banks, but its forward position would be so much better.
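
To put rough numbers on the "seeing over hoods and snow banks" point, the blind patch of road behind a low obstruction is a quick similar-triangles calculation. The mounting heights and obstruction size below are my own guesses, not measured values for any Tesla camera:

```python
# Similar-triangles estimate of the blind zone behind a low obstruction
# (a hood, snow bank, or parked car). All heights/distances are illustrative guesses.

def ground_visible_again_at(cam_height_m, obstruction_height_m, obstruction_dist_m):
    """Distance from the camera at which the road surface reappears past the
    obstruction, or None if the camera cannot see over it at all."""
    if cam_height_m <= obstruction_height_m:
        return None
    # Ray from the camera grazing the obstruction's top edge hits the ground here:
    return obstruction_dist_m * cam_height_m / (cam_height_m - obstruction_height_m)

# Example: a 0.9 m snow bank 3 m ahead of the viewpoint.
for label, h in [("bumper-level camera (~0.6 m)", 0.6),
                 ("fender camera (~1.0 m)", 1.0),
                 ("leaning driver's eyes (~1.25 m)", 1.25),
                 ("windshield camera (~1.4 m)", 1.4)]:
    d = ground_visible_again_at(h, 0.9, 3.0)
    print(label, "->", "blocked entirely" if d is None else f"road visible again ~{d:.1f} m out")
```

With those made-up numbers the bumper-height view never clears the bank at all, while a windshield-height camera picks the road back up roughly 8 m out. That's the trade-off being argued here: low corner cameras win on sight lines around obstructions, high windshield cameras win on sight lines over them.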
 
There's a lot of stuff we don't know, but the one thing we do know is that the car could self-park when it shipped with USS and cannot do it now (I've posted a couple of times on this, won't do it again now). I get that self-driving has never been done before, and it will be super impressive if Tesla gets there with just the cameras, but parking sensors, rain sensors, etc. have all been done before. Why not keep this tried and tested (and cheap) tech until you have a vision-only solution?
Agreed.
 
There's a lot of stuff we don't know, but the one thing we do know is that the car could self-park when it shipped with USS and cannot do it now (I've posted a couple of times on this, won't do it again now). I get that self-driving has never been done before, and it will be super impressive if Tesla gets there with just the cameras, but parking sensors, rain sensors, etc. have all been done before. Why not keep this tried and tested (and cheap) tech until you have a vision-only solution?
Does the vision-only approach really have the potential to be that much better than other sensors? I feel like there are so many environmental factors that can impair the cameras that other sensors just aren't susceptible to. I'm struggling to see how camera-only can outperform a competent sensor suite.
 
Does the vision-only approach really have the potential to be that much better than other sensors? I feel like there are so many environmental factors that can impair the cameras that other sensors just aren't susceptible to. I'm struggling to see how camera-only can outperform a competent sensor suite.
You’re struggling to see because there’s nothing to see. Vision only will always be inferior in many, many ways.
 
Anything would be an improvement. The front left corner location cannot see over hoods or snow banks, but its forward position would be so much better.
Idea: bring back the old-fashioned tall AM/FM antenna that people used to put silly antenna toppers on, and instead put a 360° camera ball on top, extending above the driver's side A-pillar, with the video post-processed to cancel out all the swaying movement. Have it wrapped to advertise X or Grok, or make it a little Mars globe, whatever Elon likes. It could provide a bird's-eye parking view, cross-traffic vision, Omni-Sentry, parking-lot mapping for the ASS database, you name it.

I would gladly even pay $8 a month for it...
 
What are all of you talking about? Vision can be as safe as human drivers. The issues are areas of reduced visibility, such as fog and blind spots, which is the reason behind the Phoenix radar. You have furries engineering solutions, paid $200k a year by Tesla, to tackle this while some people sling ideas on forums. The challenge is making efficient use of the existing hardware.
 
Here are the updated pictures taken at the same time. Hope this helps.

Picture 1- The yellow line shows what the B-pillar camera can see compared to what the driver leaning forward can see. This likely explains why FSD sometimes begins to enter the intersection, requiring an intervention because the driver sees a car coming: FSD simply didn't see the vehicle. Notice how far the van has moved toward the center line, given that the car is already protruding into the crossing road. FSD never creeps this far forward by itself, so what the B-pillar camera actually sees is worse than these photos show.

View attachment 994089

Picture 2- B-pillar camera view. Note same shadows.

View attachment 994090
Another issue makes this difficult as well. When a vehicle begins to become visible from behind the fence, it will take a moment for it to become exposed enough to be distinguished from the background, and longer still before the Tesla can recognize and respond to it. Our eyes are marvelous at detecting motion, thanks in part to neural processing in the retina before the signal is even sent to the brain (if memory serves), especially for motion far to the sides of our central, high-resolution vision.

It would be nice to have a matching image from the Left Fender camera. I checked on my MY, and found that the fender camera can see from roughly 15º behind directly abeam to directly behind, around a 75º field of view. In your images, it looks like your car might already be turned enough for the fender cam to see even farther down the road than your lean-forward view.
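
For what it's worth, the B-pillar vs. lean-forward gap can be roughed out with standard sight-triangle geometry, and the detection delay described above then eats into whatever warning time is left. The setbacks, fence offset, and cross-traffic speed below are invented for illustration, not measurements from these photos:

```python
# Rough sight-triangle estimate: how far down a cross street a viewpoint can see
# past a fence corner, and how much warning that buys against cross traffic.
# All distances and the 50 km/h speed are invented for illustration.

def sight_distance_m(setback_m, corner_lateral_m, lane_offset_m):
    """setback_m: viewpoint's distance behind the fence line
    corner_lateral_m: lateral distance from the viewpoint to the fence corner
    lane_offset_m: distance from the fence line to the cross-traffic lane
    Returns how far along the cross lane is visible, measured from dead ahead."""
    # Grazing ray through the fence corner, extended to the lane (similar triangles).
    return corner_lateral_m * (setback_m + lane_offset_m) / setback_m

CROSS_SPEED_MPS = 50 / 3.6   # 50 km/h cross traffic
LANE_OFFSET_M = 4.0          # fence line to the near lane
CORNER_LATERAL_M = 2.0       # fence corner sits 2 m to the side of the viewpoint

for label, setback in [("B-pillar camera", 3.5),
                       ("driver leaning forward", 2.0),
                       ("front corner of the car", 0.8)]:
    seen = sight_distance_m(setback, CORNER_LATERAL_M, LANE_OFFSET_M)
    print(f"{label}: sees ~{seen:.1f} m down the lane "
          f"(~{seen / CROSS_SPEED_MPS:.1f} s of warning at 50 km/h)")
```

Even with made-up numbers the trend matches the photos: moving the viewpoint toward the fence line sharply increases how much of the cross street is visible, which is exactly the gap between the B-pillar camera and the leaning driver, and it's why the car has to creep so far before the camera catches up.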
 
Another issue makes this difficult as well. When a vehicle begins to become visible from behind the fence, it will take a moment for it to become exposed enough to be distinguished from the background, and longer still before the Tesla can recognize and respond to it. Our eyes are marvelous at detecting motion, thanks in part to neural processing in the retina before the signal is even sent to the brain (if memory serves), especially for motion far to the sides of our central, high-resolution vision.

It would be nice to have a matching image from the Left Fender camera. I checked on my MY, and found that the fender camera can see from roughly 15º behind directly abeam to directly behind, around a 75º field of view. In your images, it looks like your car might already be turned enough for the fender cam to see even farther down the road than your lean-forward view.
The fender camera won't help here, but I will try to capture an image from that intersection for confirmation. When I got my HW4 Model Y I checked all the camera views. What I found was that the front "wide" view and "fender" view are of little help in situations like this. I was actually very surprised at how limited the "wide" view was.

For example, the wide view:

FSD Wide Angle View.jpg
 
Does the vision-only approach really have the potential to be that much better than other sensors? I feel like there are so many environmental factors that can impair the cameras that other sensors just aren't susceptible to. I'm struggling to see how camera-only can outperform a competent sensor suite.
Vision (cameras) only will never be as good as ultrasonic sensors plus cameras. I agree that rain and snow will always limit the capability of cameras. Removing the radar was also a stupid idea; radar can see further ahead in rain, snow, and fog. Tesla removed the ultrasonic sensors and radar to cut costs. Tesla cars will not reach Level 5 with Tesla Vision only.
 
Vision (cameras) only will never be as good as ultrasonic sensors plus cameras. I agree that rain and snow will always limit the capability of cameras. Removing the radar was also a stupid idea; radar can see further ahead in rain, snow, and fog. Tesla removed the ultrasonic sensors and radar to cut costs. Tesla cars will not reach Level 5 with Tesla Vision only.
Current cars won't reach autonomy in any meaningful ODD.
 
Overall, it was pretty amazing and made the trip a lot easier. It's always good when you appreciate your car more after a road trip.

Rented Model 3 with EAP for 500 miles of driving over Thanksgiving weekend (most with EAP disabled).

Wife was happy that the driving was so good. She said it was like back to the old days where I was a good driver. She speculated (correctly of course) that it was because I wasn’t using that terrible FSD.

Appreciate my car less now. EAP superior in every way???
 
Starting to look like 11.8.x will be the holiday update. Despite what some may want to believe, V12 is so far from making it to release, and only available to a select few employees, that there is just no way it could make it through all the steps needed to reach wide release in just a few days.
Can't the exact same thing be said for FSDb 11.4.8? It is only available to a select few employees right now... (There is no 11.8 that I am aware of.)

But both have likely been in the hands of the QA team for a while. (Of course, we don't know how big the QA team is.)
 