Welcome to Tesla Motors Club
The next big milestone for FSD is v11. It is a significant upgrade, with fundamental changes to several parts of the FSD stack, including a totally new way to train the perception NN.

From AI day and Lex Fridman interview we have a good sense of what might be included.

- Object permanence both temporal and spatial
- Moving from “bag of points” to objects in NN
- Creating a 3D vector representation of the environment all in NN
- Planner optimization using NN / Monte Carlo Tree Search (MCTS)
- Change from processed images to “photon count” / raw image
- Change from single image perception to surround video
- Merging of city, highway and parking lot stacks a.k.a. Single Stack
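The planner item above mentions Monte Carlo Tree Search. As a rough illustration of the technique itself (not Tesla's planner; the lane-offset state, reward, and constants below are all invented for the toy example), a minimal MCTS loop looks like this:

```python
import math
import random

random.seed(0)  # reproducible toy run

ACTIONS = [-1, 0, 1]   # steer left, keep lane, steer right
GOAL = 6               # target lane offset (toy)
HORIZON = 6            # planning depth

class Node:
    def __init__(self, state, parent=None):
        self.state = state      # (lane_offset, depth)
        self.parent = parent
        self.children = {}      # action -> Node
        self.visits = 0
        self.value = 0.0

def rollout(state):
    """Random playout to the horizon; reward is negative distance to GOAL."""
    lane, depth = state
    while depth < HORIZON:
        lane += random.choice(ACTIONS)
        depth += 1
    return -abs(lane - GOAL)

def ucb1(child, parent_visits, c=1.4):
    # Every created child has >= 1 visit, so no division by zero.
    return (child.value / child.visits
            + c * math.sqrt(math.log(parent_visits) / child.visits))

def mcts(root_state, iterations=2000):
    root = Node(root_state)
    for _ in range(iterations):
        node = root
        # 1) Selection: walk down fully expanded nodes via UCB1.
        while node.children and len(node.children) == len(ACTIONS):
            node = max(node.children.values(),
                       key=lambda ch: ucb1(ch, node.visits))
        # 2) Expansion: add one untried action, unless at the horizon.
        if node.state[1] < HORIZON:
            a = random.choice([x for x in ACTIONS if x not in node.children])
            node.children[a] = Node((node.state[0] + a, node.state[1] + 1), node)
            node = node.children[a]
        # 3) Simulation from the new node.
        reward = rollout(node.state)
        # 4) Backpropagation up to the root.
        while node is not None:
            node.visits += 1
            node.value += reward
            node = node.parent
    # The recommended first action is the most-visited child.
    return max(root.children.items(), key=lambda kv: kv[1].visits)[0]

best = mcts((0, 0))
print(best)  # most-visited first action; converges toward the goal lane
```

The four phases (selection, expansion, simulation, backpropagation) are the standard MCTS loop; a real planner would replace the random rollout with a learned value/policy estimate, which is roughly what the NN-planner item describes.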

Lex Fridman interview of Elon, starting with FSD-related topics.


Here is a detailed explanation of Beta 11 in layman's terms by James Douma, in an interview done after the Lex podcast.


Here is the AI Day explanation, in 4 parts.




Here is a useful blog post asking Tesla a few questions about AI Day. The useful part is the comparison of Tesla's methods with Waymo's and others' (detailed papers linked).

 
The public thinks that FSD is out and that it does exactly what's shown in that 3-year-old video. They don't know that only the beta is out, and even that can't perform as well as the video.

Based on the large number of improvements still needed I expect a couple months.

Also, they seem to be using essentially the same code as 10.69.x (it really looks basically identical, and they have said as much), so there's a question of whether they will make any of the big changes they have hinted at (and deal with all the resulting regressions) before a wider release. In that case it could be 3-4 months or more. Though I suspect that won't happen and they will punt the full rewrite until later.
@AlanSubie4Life, you're being too pessimistic.
 
I definitely hope I am wrong. Maybe we’ll have it before April 1st! Or maybe on April 1st. They clearly will have to do another rev or two though, and that takes time.

As you know, I’m generally quite optimistic, but I do like to try to stay grounded in reality whenever I can.
 
The "nags" are in place because anyone with a Tesla and the money to purchase FSD can drive all over the US and Canada. That's a whole lot of jurisdictions that need to be comfortable with the concept. The geofenced approaches only have to cope with a few jurisdictions. And we've already seen one of them (SF) with officials expressing concerns. The whole AD issue hinges on attaining a safety standard well above the median value.

My personal view, based on financial considerations alone, is that robotaxis make sense in the short term (5 years) for buses and smaller people movers that operate at low speeds.

Personally, I would be very happy if they just removed nags on closed access highways (with the exception of sections where the car sees construction cones), similar to Super Cruise and BlueCruise. They could even start small and just do major interstates first. It's odd to me that Tesla hasn't made any attempt at this yet. Even though FSD Beta is doing some things that are much more impressive than any other car available for purchase, many people still see Super Cruise and BlueCruise as more impressive and desirable purely because of the hands off aspect.
 
Without infrared (IR) it is extremely hard to identify the eyes. Tesla has been trying to use the interior camera for a couple of years now, but the results are inconsistent. It seems Tesla may have hit a wall and is making very little progress. They may add IR to the rumored revised versions of the 3/Y, but that won't help us legacy owners.
 
I thought that the 3/Y have had IR for the "selfie" camera for a long time now...
 

I guess I just assumed that the camera was able to see well enough. I might have to find a deserted road and do some testing by squinting my eyes nearly closed to see if it notices at night.

Even if lack of infrared and inability to see eyes at night is the issue, I would gladly take a hands-off feature that only worked during the daytime over nothing.
 
In daytime, if you wear sunglasses it can be fooled. It relies on AI to find and track your eyes, so hats or other objects can confuse it. Even a picture of a person can fool the camera. So there is no safe way that I know of for Tesla to offer this in an L2 system. Hopefully Tesla will figure something out.
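One way a system might cope with those failure modes (a sketch only; the names and thresholds below are invented, not Tesla's) is to gate on eye-tracking confidence, fall back to wheel-torque nags when the camera can't be trusted, and escalate as inattention persists:

```python
import enum

class Nag(enum.Enum):
    NONE = 0
    VISUAL = 1      # flash a warning on the screen
    AUDIBLE = 2     # chime
    DISENGAGE = 3   # hand control back / slow down

def monitoring_source(eye_confidence, min_confidence=0.8):
    """Trust the cabin camera only when it confidently sees the eyes;
    otherwise degrade to the traditional steering-wheel torque nag."""
    return "camera" if eye_confidence >= min_confidence else "wheel_torque"

def escalate(seconds_inattentive):
    """Escalating warnings as inattention persists (thresholds invented)."""
    if seconds_inattentive < 3:
        return Nag.NONE
    if seconds_inattentive < 6:
        return Nag.VISUAL
    if seconds_inattentive < 10:
        return Nag.AUDIBLE
    return Nag.DISENGAGE
```

With sunglasses, `eye_confidence` would presumably stay low, so the system never pretends it can see the eyes; it just keeps the wheel nag, which matches how the current behavior feels in practice.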

 
Agreed. My prescription sunglasses look dark from the outside, but what I see through them is polarized amber. The camera is easily fooled. FSD has only warned me a couple of times, even when I made a point of looking at the display for far too long (I had a passenger with me). As far as I'm concerned, with my sunglasses it's useless.
 
Tesla has been trying to use the interior camera for a couple of years now but the results are inconsistent.
Definitely the computer vision revolution has a ways to go!

As a human, using the interior camera, it’s quite easy to tell if the driver is paying attention (and even if they have hands on wheel, even though you can’t see the hands or the wheel!).

But unfortunately the computer vision just isn’t there yet.

No need for IR eye detectors (in the day at least). Just keep a very close eye on head angle, and for a given driver (every driver will look different) it is easy to figure out what is happening and whether they are paying attention, even if they are wearing sunglasses.
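The head-angle-plus-per-driver-baseline idea described above can be sketched roughly like this (the field names, tolerances, and windowing are all illustrative assumptions, not anyone's actual implementation):

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Frame:
    """Hypothetical per-frame head-pose estimates from a cabin camera."""
    yaw_deg: float    # head turn left/right (0 = facing the road)
    pitch_deg: float  # head tilt up/down

class AttentionMonitor:
    """Calibrate on the individual driver, then flag sustained deviation
    of head angle from that driver's own road-facing baseline."""

    def __init__(self, yaw_tol=25.0, pitch_tol=20.0, window=10):
        self.yaw_tol = yaw_tol
        self.pitch_tol = pitch_tol
        self.window = window
        self.baseline_yaw = 0.0
        self.baseline_pitch = 0.0
        self.recent = []

    def calibrate(self, frames):
        # Learn what "facing the road" looks like for this driver.
        self.baseline_yaw = mean(f.yaw_deg for f in frames)
        self.baseline_pitch = mean(f.pitch_deg for f in frames)

    def update(self, frame):
        """Returns True while the driver appears attentive."""
        off_road = (abs(frame.yaw_deg - self.baseline_yaw) > self.yaw_tol or
                    abs(frame.pitch_deg - self.baseline_pitch) > self.pitch_tol)
        self.recent.append(off_road)
        self.recent = self.recent[-self.window:]
        # Attentive unless deviation is *sustained*: brief mirror or
        # shoulder checks inside the window are tolerated.
        return sum(self.recent) < self.window * 0.7
```

For example, after calibrating on a few road-facing frames, a single glance away still reads as attentive, but ten consecutive frames of looking down (say, at a phone) trips the monitor. Sunglasses don't matter here because nothing depends on seeing the eyes.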

Appears not to be easy for a computer though! Humans are much better at such tasks (for limited durations), and we have plentiful evidence of that now.

I’ve spent quite a bit of time looking at the interior camera view, and it is quite effective indeed for this purpose, even though it is completely non-optimal and not intended for such a purpose.

The example above is easy to identify for me too.

Human vision and compute is pretty amazing. IR is an easy crutch for computer vision I guess. (And of course it would help a human too but it isn’t needed at all times.)

So there is no safe way that I know of for Tesla to offer this in a L2 system.
They just have to design it better and use computer vision techniques that work, when they exist.
 
No IR transmitters in the cars that I know of. Where do you believe they are located? They are easy to identify using your phone's video recorder, and I've never seen any from inside a Y/3.
After two repair visits, my 2022 Model Y LR now has detectable IR emitters. Two, up in the assembly above the rear view mirror. iPhone pic attached.
 

I definitely hope I am wrong. Maybe we’ll have it before April 1st! Or maybe on April 1st. They clearly will have to do another rev or two though, and that takes time.

As you know, I’m generally quite optimistic, but I do like to try to stay grounded in reality whenever I can.
Seems like maybe I’ll be wrong! We’ll see!

I guess, since it is just 10.69, I should have known there was not much risk.
 