Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
The next big milestone for FSD is version 11. It is a significant upgrade, with fundamental changes to several parts of the FSD stack, including a totally new way to train the perception NN.

From AI day and Lex Fridman interview we have a good sense of what might be included.

- Object permanence both temporal and spatial
- Moving from “bag of points” to objects in NN
- Creating a 3D vector representation of the environment all in NN
- Planner optimization using NN / Monte Carlo Tree Search (MCTS)
- Change from processed images to “photon count” / raw image
- Change from single image perception to surround video
- Merging of city, highway and parking lot stacks a.k.a. Single Stack

Lex Fridman's interview of Elon, starting with FSD-related topics.


Here is a detailed explanation of Beta 11 in layman's language by James Douma, an interview done after the Lex podcast.


Here is the AI Day explanation in 4 parts.




Here is a useful blog post asking a few questions to Tesla about AI day. The useful part comes in comparison of Tesla's methods with Waymo and others (detailed papers linked).

 
Haha... Not even... on Beta 11.4.4 this still occurs on several routes that I regularly travel. That's the problem with FSD in general: it's consistently inconsistent. I have seen some improvement in 2 years with the program, but this and other issues like phantom braking (7+ years now?) still persist to an annoying degree. It makes me wonder if it has reached a hardware limitation that cannot be addressed by the Beta team.
I should preface with a note that beta is beta. However, these issues bleed into the broader Autopilot community, and while improved, they have not been resolved for a very long time...
 
Haha... Not even... on Beta 11.4.4 this still occurs on several routes that I regularly travel. That's the problem with FSD in general: it's consistently inconsistent. I have seen some improvement in 2 years with the program, but this and other issues like phantom braking (7+ years now?) still persist to an annoying degree. It makes me wonder if it has reached a hardware limitation that cannot be addressed by the Beta team.
When the speed limit changes from 75 to 55, your car hard brakes to the new speed? That's what my old AP stack used to do, but once FSD Beta took over the freeway AP stack, it slows down much more smoothly, enough that I can feel the slow down, spot the reason being the new speed limit, and wheel up the speed while only dropping maybe 5MPH.
 
No, it wouldn't be stuck. FSD will slow down to get over as well. It is possible that nobody would leave a big enough gap for it to change into, but that is unlikely. (Especially as they could see that it would break the blockade.)
It would be even worse if the purported "maintain speed based on traffic flow" behavior is a real feature. The four Teslas would be stuck tracking each other.
 
How do state laws on autonomous vehicles deal with a driver instructing the car to violate a law?

We are not talking about driverless robotaxis here, but rather Tesla cars in which we can disengage FSD or intervene.

Our cars can be driven in FSD, or manually, or in a sort of hybrid mode when we intervene with speed setting changes or go-pedal nudges. Perhaps the FSD defaults need to comply, but manual interventions are allowed, even if technically illegal.
 
How do state laws on autonomous vehicles deal with a driver instructing the car to violate a law?

We are not talking about driverless robotaxis here, but rather Tesla cars in which we can disengage FSD or intervene.

Our cars can be driven in FSD, or manually, or in a sort of hybrid mode when we intervene with speed setting changes or go-pedal nudges. Perhaps the FSD defaults need to comply, but manual interventions are allowed, even if technically illegal.
That's as yet an unknown, with no legal precedent (I talk like I know, and I don't). But I think the driver is still responsible.
 
How do state laws on autonomous vehicles deal with a driver instructing the car to violate a law?

We are not talking about driverless robotaxis here, but rather Tesla cars in which we can disengage FSD or intervene.

If the human disengages it's not autonomous.

Autonomous means the car is driving. When it's doing that it must obey all traffic laws.

When it's NOT doing that (i.e. the human does something), then whatever is done is on the human, who can of course choose to break the law.


Our cars can be driven in FSD, or manually, or in a sort of hybrid mode

Nope.

There is never, ever, any point at which the human is not, legally, the driver of the car in any Tesla currently, even if the L2 ADAS system is handling certain specific parts of the dynamic driving task.

To my knowledge no states impose any specific regulations on L2 ADAS systems that aren't actually the driver of the vehicle.
 
If the human disengages it's not autonomous.

Autonomous means the car is driving. When it's doing that it must obey all traffic laws.

When it's NOT doing that (i.e. the human does something), then whatever is done is on the human, who can of course choose to break the law.
I guess I was not clear. My question was in regards to the fears expressed here that strict adherence to laws by an L3+ system may make a future version of FSD unsafe.

So, what I meant to ask was not about liability, but rather to regulations which might apply if FSD does become L3+. My hunch is that it may not be a problem if the regulations acknowledge that any intentional driver interventions instantly shift criminal and civil responsibility to the driver.

This earlier post is what got me thinking about this:
AFAIK every single state that has legalized self-driving vehicles requires the car maker to certify the system obeys all traffic laws. Which includes speed limits.
The response to the following confused me. I said:
Our cars can be driven in FSD, or manually, or in a sort of hybrid mode...
And you replied:
As Alan Greenspan is reported to have said:

I know you believe that you understand what you thought I said, but I'm not sure you realize that what you heard was not what I meant.​

Your point that we are always responsible is well understood as it applies to current FSD. I'm curious about what it'll be like if it ever makes it to a level up.
 
That's as yet an unknown, with no legal precedent (I talk like I know, and I don't). But I think the driver is still responsible.
Clearly the current robotaxi operations are regulated. With no driver in the car, it is certainly not the passengers who bear responsibility. My question probably relates to those regulations, and how they will apply to FSD if it becomes L3 or higher.

A hypothetical: you are in your L4 Tesla Model Z, and L4 FSD drives you toward a head-on collision with another car. You slam on the brakes, succeed in slowing down some, and survive. However, someone in the other car dies in spite of your heroic intervention. Since you disengaged FSD, are you liable? Would not hitting the brake have avoided that liability, but at great risk to or loss of your own life?

I expect that Tesla records and uploads details like pedal pressure so they can prove driver error.
 
So, what I meant to ask was not about liability, but rather to regulations which might apply if FSD does become L3+. My hunch is that it may not be a problem if the regulations acknowledge that any intentional driver interventions instantly shift criminal and civil responsibility to the driver.
I think there is a disconnect between L2 and L3 in your description. At L3+, the driver cannot intervene except to disengage the system and take over the driving task. I think that is where some people are getting hung up on the logic. They think they can engage L3 mode (where Tesla takes liability and the system is driving) and also set parameters such as overriding the speed limit, or press the accelerator to intentionally speed up or close a gap. That is not the case. The only options are disengaging the system via a button or stalk, or pressing the brake pedal.
 
I think there is a disconnect between L2 and L3 in your description. At L3+, the driver cannot intervene except to disengage the system and take over the driving task. I think that is where some people are getting hung up on the logic. They think they can engage L3 mode (where Tesla takes liability and the system is driving) and also set parameters such as overriding the speed limit, or press the accelerator to intentionally speed up or close a gap. That is not the case. The only options are disengaging the system via a button or stalk, or pressing the brake pedal.
Frankly, I'd be perfectly happy with L3 with the appropriate takeover logic by Tesla. I don't care much about L4/L5, since I'd rather not wait a few years and another hardware iteration. For me, robotaxi is meaningless in the near term. Picked up my new MY today, so I will have to wait to use L2 FSD again. I was surprised at how poor basic NoA was just driving on the highway and regular roads. You forget how much FSD has improved on just the simple stuff.
 
Another point to discuss is moving through the SAE level tree. Currently, as far as I know from demo videos with Mercedes, which currently has L3 vehicles on the road, the car does not move through the SAE level tree on its own. In other words, when the L3 ODD is breached, the car does not automatically move down to L2. It terminates L3 and hands control back to the driver after the mandatory warning period. The driver can then, if they want to, enable the L2 feature to continue ADAS functions, while remaining in control of the vehicle (just like Tesla ADAS features). Similarly, if the driver is in L2 mode, the car will not automatically switch to L3 when it enters its L3 ODD. The driver must initiate the L3 system.

This is likely to avoid any confusion over who is in control of the car, and who is legally liable/responsible for the driving task.
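The handover behavior described above can be sketched as a tiny state machine: leaving the L3 ODD hands control back to the driver rather than silently dropping to L2, and entering the ODD never auto-upgrades. This is a purely illustrative toy (the class, method names, and the elided warning period are all my own), not any manufacturer's actual logic.

```python
# Toy state machine for L2/L3 mode transitions as described above.
# Illustrative only; names and structure are hypothetical.

MANUAL, L2, L3 = "manual", "L2", "L3"

class DrivingMode:
    def __init__(self):
        self.mode = MANUAL

    def driver_engages(self, requested, in_l3_odd):
        # The driver must explicitly request a level; L3 can only be
        # initiated inside its ODD.
        if requested == L3 and not in_l3_odd:
            return self.mode          # request refused, mode unchanged
        self.mode = requested
        return self.mode

    def odd_update(self, in_l3_odd):
        # Breaching the L3 ODD terminates L3 and (after the mandatory
        # warning period, elided here) hands control back to the driver.
        # It does NOT automatically downgrade to L2.
        if self.mode == L3 and not in_l3_odd:
            self.mode = MANUAL
        # Likewise, entering the L3 ODD while in L2 does NOT auto-upgrade:
        # no transition is made here for that case.
        return self.mode
```

The design point is that every move up the level tree is driver-initiated, which keeps the question "who is the driver right now?" unambiguous.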
 
I noticed that a while ago (think I may have posted about it?) First at an intersection that had temporary stop signs placed due to construction but I've noticed it at other places as well. I don't know why I didn't think of it before but I just completed a camera calibration. I haven't been back to the problem intersection yet but the few stop signs I've encountered have been visualized correctly so we'll see how it does.
Just an update - I recalibrated my cameras and there's no change. FSD has double vision with stop signs and frequently misplaces them. This correlates with poor stopping performance. It's pretty clear the current version of FSD has significant perceptual issues.
 
Yes. "You are not the driver" means you cannot control the car as long as the system is engaged.

View attachment 965224
 
I live in SoCal and go to Los Angeles a lot. I never go much more than 5 over the speed limit. At 70 mph I am passing some and some are passing me; 70 mph is the sweet spot. Traveling at 70 mph has never been a problem, even though some (not many) people go 10 and 15 above the speed limit. A lot of people go the speed limit. The danger is when it rains; some idiots just don't slow to a safe speed.

With that said, maintaining speed for traffic flow is potentially dangerous. Many things come to mind. Speed is lane dependent, with slower speeds in the right lanes. Is the car gonna match the speed of someone trying to pass you ... road rage scenario. Is the car gonna get in the fast lane and pass everyone, or is that against maintaining speed? Is this the end of street racing, where there's no winner because the car is matching your opponent as he speed-shifts through the gears? If this is a real feature, and not a YouTuber fabrication, the behavior should be documented and overridable.
I live in the LA area as well and agree with you. My problem is not so much keeping up with traffic, as I set my speed at 73, which is a real speed of 72, the usual traffic flow. My problem is I don't like when traffic is completely or almost stopped ahead on the highway. I see the brake lights way in advance and slow down then. If I waited for FSDb, it would wait too long to brake and brake too hard. If there were cars or a car way behind, they would most likely hit me.
 
When the speed limit changes from 75 to 55, your car hard brakes to the new speed? That's what my old AP stack used to do, but once FSD Beta took over the freeway AP stack, it slows down much more smoothly, enough that I can feel the slow down, spot the reason being the new speed limit, and wheel up the speed while only dropping maybe 5MPH.
I misinterpreted the use case that you had described. I experienced a different issue. The car would always hard brake in downtown San Francisco on Hwy 101 when I was on a freeway overpass with a speed limit of 55. FSD would abruptly, without warning, hard brake and change the speed limit down to 25, and I suspect it had something to do with the roadway under the overpass having a speed limit of 25. I assume it positioned the coordinates of the car as traveling on the road under the overpass.
 