Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
The next big milestone for FSD is version 11. It is a significant upgrade, with fundamental changes to several parts of the FSD stack, including a totally new way of training the perception NN.

From AI day and Lex Fridman interview we have a good sense of what might be included.

- Object permanence both temporal and spatial
- Moving from “bag of points” to objects in NN
- Creating a 3D vector representation of the environment all in NN
- Planner optimization using NN / Monte Carlo Tree Search (MCTS)
- Change from processed images to “photon count” / raw image
- Change from single image perception to surround video
- Merging of city, highway and parking lot stacks a.k.a. Single Stack
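Of the items above, the planner one is the easiest to make concrete. Below is a minimal Monte Carlo Tree Search sketch applied to a toy "unprotected turn" decision (keep waiting for a bigger gap vs. go now). The MDP, the rewards, and the success probabilities are all invented for illustration; this is a generic MCTS with UCB1 selection, not Tesla's actual planner.

```python
import math, random

# Toy "unprotected turn" MDP: state = seconds waited. "go" succeeds with a
# probability that improves the longer you wait (a bigger gap appears);
# "wait" costs a little time. All numbers are made up for illustration.
def step(state, action, rng):
    if action == "wait":
        return state + 1, -0.1, False          # small time penalty, not terminal
    p_success = min(0.2 + 0.2 * state, 0.95)   # waiting improves the odds
    if rng.random() < p_success:
        return state, +1.0, True               # completed the turn
    return state, -5.0, True                   # unsafe turn, big penalty

class Node:
    def __init__(self):
        self.n = 0          # visit count
        self.q = 0.0        # running mean value
        self.children = {}  # action -> Node

def ucb(parent, child, c=1.4):
    # UCB1: exploit high-value children, but keep exploring rarely-tried ones.
    if child.n == 0:
        return float("inf")
    return child.q + c * math.sqrt(math.log(parent.n) / child.n)

def rollout(state, rng, depth=10):
    # Estimate a leaf's value by playing random actions to the horizon.
    total, discount = 0.0, 1.0
    for _ in range(depth):
        state, r, done = step(state, rng.choice(["wait", "go"]), rng)
        total += discount * r
        discount *= 0.95
        if done:
            break
    return total

def simulate(node, state, rng, depth=10):
    if depth == 0:
        return 0.0
    if not node.children:                       # expand leaf, then rollout
        node.children = {a: Node() for a in ["wait", "go"]}
        value = rollout(state, rng, depth)
    else:                                       # select child by UCB, recurse
        action = max(node.children, key=lambda a: ucb(node, node.children[a]))
        child = node.children[action]
        next_state, r, done = step(state, action, rng)
        value = r if done else r + 0.95 * simulate(child, next_state, rng, depth - 1)
        child.n += 1
        child.q += (value - child.q) / child.n
    node.n += 1
    node.q += (value - node.q) / node.n
    return value

def plan(state, iters=2000, seed=0):
    rng = random.Random(seed)
    root = Node()
    for _ in range(iters):
        simulate(root, state, rng)
    # Standard MCTS: act on the most-visited child, not the highest mean.
    return max(root.children, key=lambda a: root.children[a].n)

print(plan(state=0))
```

With a 20% initial success rate, going immediately has expected value 0.2·1 + 0.8·(−5) = −3.8, so the search should learn to wait a few steps first. Tesla's AI Day presentation described combining such tree search with learned (NN) value and policy estimates in place of the random rollout used here.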

Lex Fridman's interview of Elon, starting with FSD-related topics.


Here is a detailed explanation of Beta 11 in layman's terms by James Douma, from an interview done after the Lex podcast.


Here is the AI Day explanation in 4 parts.




Here is a useful blog post asking a few questions to Tesla about AI day. The useful part is the comparison of Tesla's methods with those of Waymo and others (detailed papers linked).

 
Disagree with this - the alternative is putting stoplights at every such intersection, which is a huge expense and impacts traffic flow on the larger street. It all depends on traffic patterns. There are 2 such intersections leading out of our development, so I can't avoid them, but in the 10 years we've lived there accidents have not been a problem, so it can't really be argued that they're a hazard.

Regardless, such intersections are common throughout the U.S. so any FSD software needs to be able to manage them.
Except that Tesla's advertising for FSD has the caveat that it will be able to operate in "almost all circumstances",
and the Chuck Cook tests show that this may be one of the exceptions. Fine by me. Robotaxis have other ways to get to a destination.
 
Turning from a local road onto a trunk road (i.e. county road, state highway, etc) with a significantly higher speed limit is an incredibly common occurrence, particularly if you live outside of a city. If FSD can't handle this they're essentially saying it can't be used outside of cities.
 
I think what is uncommon is for that to be very busy.

But in general I agree - FSD needs to handle this. But not when the disengagement rate is 1 in 10 miles. Maybe when it gets from 1 in 5,000 miles to 1 in 10,000 miles, it will need to handle such situations.
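For scale, those rates compound very differently over a whole trip. A quick sketch of the arithmetic, assuming (simplistically) that disengagements are independent events with a fixed per-mile probability:

```python
# Probability of an intervention-free trip, assuming disengagements are
# independent events with a fixed per-mile rate (a simplification).
def p_clean_trip(miles_per_disengagement, trip_miles):
    p_per_mile = 1.0 / miles_per_disengagement
    return (1.0 - p_per_mile) ** trip_miles

for rate in (10, 5_000, 10_000):
    # At 1-in-10 miles, a 20-mile trip is clean only ~12% of the time;
    # at 1-in-10,000 miles it is clean ~99.8% of the time.
    print(f"1 in {rate:>6} miles -> {p_clean_trip(rate, 20):.1%} clean 20-mile trips")
```

The independence assumption is generous to FSD, since disengagements cluster at hard intersections like the ones being discussed, but it shows why a 1-in-10 rate makes prioritizing rare hard cases premature.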
 

Sounds to me like FSD and the navigation need to have a bit of a chat before showing us directions.

Perhaps an FSD capable route, and a faster non-FSD capable route.
 
When the disengagement rate is like 1 in 4,000 miles ;)

IOW, there are a lot of things FSD needs to get right first, before handling such niche, difficult cases.
Well, I don't think it's a 'niche' case - that was the whole point of my last post. It's actually a pretty common occurrence. The last time I drove that intersection, there was no traffic at all, so the traffic issues shouldn't even have come into play.

Of note, the other exit from my development (30 MPH road with a stop sign onto a 45 MPH road with no stop) is now showing the same 'creeping forward, press accelerator' warning. I'm pretty sure it didn't do that before, and I can say for a fact that FSD has successfully navigated that intersection before, so it's clearly something they're working on and actively making changes to. I'm guessing they have such intersections in a 'not ready right now, working on it' category.
 
Chuck’s turn is nasty for humans too: two lanes of busy, fast-moving traffic with poor lateral visibility. It’s really good that he keeps testing the car (and risking it!) :)
 
The hairiest situation I encountered recently (not using FSD) was an unfamiliar "merge" where it was near impossible to see anything. The headrest and pillar block the view badly.

North on 687 to merge onto 231. That is a 22 degree angle. I doubt FSD could handle this in any sort of graceful and safe way.

 
That's just an incredibly poorly designed intersection - a yield with no merge lane and poor visibility. I think someone at the highway department was drunk!
 
No kidding! Tesla navigation put me on that route. On the return trip, I could see there was a safer alternative but understood why navigation chose the left turn at a light onto 687 instead of the earlier left turn onto bus29, no light, crossing two lanes of the 4 lane high traffic road.
 

It's a perfect example of why I wish there were better navigation-to-FSD Beta integration.

FSD Beta should be able to request an easier navigation plan.

Personally I'd love to draw my own route.
 
As a % of the total number of trips, how many will encounter such a situation?

Better still, what % of disengagements are because of this scenario? This is what Tesla should be looking at.
Well, as of right now it appears there would be 100% disengagements, because it can't complete the turn without human intervention!

For me, there are 2 practical exits from my neighborhood, both of which involve such turns so if I start navigation before the turn it will happen on 100% of my trips.

I obviously can't say what percentage of roads or intersections would fall under this but I live in a development of about 500 houses. Looking at the map there are 8 possible exits and 7 of them involve turns onto a county road with cross traffic traveling 45-50 MPH (posted speed). There are also a large number of Teslas that I see in my neighborhood, FWIW.
 
Take the total of ALL disengagements using FSD by 60k Teslas. How many of those disengagements would be because of such roads? An extremely small %.

The disengagement percentage isn't going to convey an accurate story of what's happening.

When faced with such roads I typically turn it off, and I might use the brake or I might use the lever. If I use the brake it counts as a disengagement, but if I use the lever it doesn't.

If I do happen to use it I might hit the throttle, but that's not a disengagement. It's still an issue, as it's an intervention.

I wish Tesla had a "don't use my disengagement data, as it's crap and it's meaningless" check box.
 
Yes - my comment was intended to be (at least partially) tongue in cheek.
Agreed - I've posted something similar prior to this. Essentially, how many potential disengagements are not counted because people proactively turn off FSD in situations in which they know it will have problems? I'd actually expect the disengagement rate to remain somewhat stagnant through the initial FSD improvements because people would leave it on in progressively more difficult situations.

Likewise, pressing the accelerator is a significant intervention because the person pressing it is evaluating the traffic and making the decision that it's safe to proceed. Essentially, the only thing FSD is doing at that point is steering, arguably one of the easiest of the tasks to accomplish.

There have been previous discussions about how Tesla counts disengagements. Obviously no one knows, but the general conclusion is they would count braking and steering overrides but not stalk disengagements. Regardless, there's no great way that won't miss a significant number.
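The counting question can be made concrete with a toy sketch of the taxonomy being speculated about here: some driver actions count as disengagements, one ends FSD without being counted, and accelerator presses are interventions that leave FSD engaged. The categories are this thread's guesswork, not Tesla's documented policy.

```python
# Sketch of the event taxonomy the thread speculates about. The mapping of
# actions to categories is forum guesswork, not Tesla's documented policy.
def classify(event):
    if event in ("brake_press", "steering_override"):
        return "disengagement"     # likely counted, per the thread's consensus
    if event == "stalk_up":
        return "uncounted"         # ends FSD without being logged (speculation)
    if event == "accelerator_press":
        return "intervention"      # driver assists, but FSD stays engaged
    return "none"

# A disengagement-per-mile metric built only from the first category would
# miss half the events in this hypothetical drive log:
log = ["accelerator_press", "stalk_up", "brake_press", "steering_override"]
counts = {}
for e in log:
    counts[classify(e)] = counts.get(classify(e), 0) + 1
print(counts)   # {'intervention': 1, 'uncounted': 1, 'disengagement': 2}
```

Under these assumptions, any rate computed from counted disengagements alone understates how often drivers actually step in, which is the point being made above.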