Welcome to Tesla Motors Club
The next big milestone for FSD is version 11. It is a significant upgrade, with fundamental changes to several parts of the FSD stack, including a totally new way to train the perception NN.

From AI day and Lex Fridman interview we have a good sense of what might be included.

- Object permanence both temporal and spatial
- Moving from “bag of points” to objects in NN
- Creating a 3D vector representation of the environment all in NN
- Planner optimization using NN / Monte Carlo Tree Search (MCTS)
- Change from processed images to “photon count” / raw image
- Change from single image perception to surround video
- Merging of city, highway and parking lot stacks a.k.a. Single Stack
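
One of the bullets above mentions Monte Carlo Tree Search for the planner. As a rough, generic illustration of the UCT selection rule at the heart of MCTS (the action names and statistics here are made up, and this assumes nothing about Tesla's actual planner):

```python
import math

def uct_score(child_value, child_visits, parent_visits, c=1.4):
    """Upper Confidence bound for Trees: balance exploitation vs exploration."""
    if child_visits == 0:
        return float("inf")  # always try unvisited children first
    return child_value / child_visits + c * math.sqrt(math.log(parent_visits) / child_visits)

# Pick the child with the highest UCT score -- the heart of MCTS selection.
# Each entry is (total value, visit count) for a hypothetical planner action.
children = {"keep_lane": (8.0, 10), "change_left": (3.0, 4), "slow_down": (0.0, 0)}
parent_visits = sum(v for _, v in children.values())
best = max(children, key=lambda a: uct_score(*children[a], parent_visits))
print(best)  # "slow_down" -- the unvisited action wins the exploration bonus
```

A full MCTS would repeat this selection down the tree, expand a node, roll out, and back-propagate the result; the snippet only shows the selection step.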

Lex Fridman Interview of Elon. Starting with FSD related topics.


Here is a detailed explanation of Beta 11 in "layman's language" by James Douma, from an interview recorded after the Lex podcast.


Here is the AI Day explanation, in 4 parts.




Here is a useful blog post asking a few questions to Tesla about AI day. The useful part comes in comparison of Tesla's methods with Waymo and others (detailed papers linked).

 
It IS NOT L3 on the highway today.

Obviously.

Which is why the post you quote points out 2 things they'd need to do before it could be.

Tesla takes zero liability.

Nothing in SAE J3016 requires them to "take liability".

Here's a link to the full text of J3016-

The word liability doesn't even appear in the document. At all.


L3 is defined as

The sustained and ODD-specific performance by an ADS of the entire DDT under routine/normal operation with the expectation that the DDT fallback-ready user is receptive to ADS-issued requests to intervene, as well as to DDT performance-relevant system failures in other vehicle systems, and will respond appropriately


In other words, exactly what FSD does on the highway today except it would be performing the complete OEDR task instead of the limited one it does today-- and thus no longer requires the human to perform it.

(Today Tesla asks the human to perform it by actively monitoring FSD to intervene- L3 would only require you to intervene when the car specifically asks you to and you'd need to be paying enough attention to respond to such a request, but not actively monitor the system moment to moment)


Tesla has said the OEDR for city streets is incomplete and will remain so-- so that will remain an L2 system.

They've said no such thing about FSD's highway OEDR... we know the LEGACY AP code was limited in handling stopped/partly-in-lane vehicles, but FSD seems to be fine with this. So it's entirely possible there's a complete OEDR for highways already, and thus, make the 2 changes I mention, and it's an L3 highway system.
 
Wish I was still on 11.3.6. For me and my 2023 M3LR HW3 that was the best version. 11.4.x introduced some very strange issues. For some reason 11.4.x makes incorrect turns. For instance, while driving down the road and preparing to make a right turn 1/2 mile ahead, the car chooses to make a right turn onto a street 1/4 mile early! Not sure why that is. Is it a GPS issue? Seems strange. This is happening on almost every drive I have now. The car also makes more unnecessary lane changes, and in some cases these lane changes prevent me from making my turn. Hoping they fix this ASAP.

FYI, running on the average profile and without minimal lane changes selected. Going to try Chill and MLC (if I remember to select it!)
 
If you are a passenger (which you are in L3 or above) then you can NOT be held liable for what the "driver" does.

You are a full-time passenger for L4/L5 since you are never asked to take over. With L3, you are a sort of "part-time passenger". You are not driving when L3 is engaged but you are still required to take over when the system requests it. So you are not a passenger all the time.

I think liability is pretty clear for L4/L5. You cannot be liable since you are never the driver in L4/L5. But I think liability is a bit trickier for L3 since you are sometimes asked to take over. Certainly, when L3 is engaged, you are not the driver. So you would not be liable when L3 is engaged. And certainly, you would be liable when L3 is not engaged since you are the driver. But what about the takeover request phase when the L3 was requesting you to take over? I think you could potentially be liable during the takeover request part if you failed to take over when prompted. That could be a grey zone where L3 was engaged but the driver was supposed to take over. I could see a manufacturer arguing that the L3 gave the driver plenty of time and warnings to take over and they failed to do so, therefore they are responsible for the accident that happened during the takeover request.

In fact, the issues of how the system should hand control back, how long the driver should be prompted to take over, and what the L3 system should do if the driver fails to take over are why I think L3 will likely be abandoned. I think manufacturers will prefer the simpler approach of L2, where the driver has to supervise all the time, or L4, where you don't have to worry about a takeover at all. And once autonomous driving becomes cheap and reliable enough, I think most companies will just focus on L4 anyway and jump over L3. Why even bother with L3 and worrying about human takeover when you can just remove the driver completely from the equation? That's precisely why Waymo is only focused on L4 and ignoring L2 and L3 entirely.
 
Actually liability is ALWAYS implied to be the driver's responsibility. If you are a passenger (which you are in L3 or above) then you can NOT be held liable for what the "driver" does.

View attachment 950572

Spoiler: The nice graphics chart isn't a law.

Again nothing in J3016 at all discusses who is legally liable for accidents. That's not its scope, or purpose, and SAE does not write laws.

So there's no requirement to have the maker assume liability in order to be L3. One has literally nothing to do with the other.

I do think any car maker trying to deploy an L3 system WITHOUT taking liability would have significant problems of various sorts- but none of them would be "it isn't really L3 if you don't do that"
 
Again nothing in J3016 at all discusses who is legally liable for accidents. That's not its scope, or purpose, and SAE does not write laws.

Technically true. The laws would need to codify that liability, the SAE levels do not presume liability. But I think the levels do imply liability. I think right now, the law says that the legal driver is liable. For L0/L1/L2, the human is the legal driver so they would be liable. But for L4/L5, the human is a passenger, they are not the legal driver. So how could they be considered liable when they are not the legal driver? That makes no sense. And there is precedent. Passengers who ride in a Waymo are not liable. And you are not liable now if you are the passenger in a car that is involved in an accident.
 
Technically true. The laws would need to codify that liability, the SAE levels do not presume liability. But I think the levels do imply liability. I think right now, the law says that the legal driver is liable. For L0/L1/L2, the human is the legal driver so they would be liable. But for L4/L5, the human is a passenger, they are not the legal driver. So how could they be considered liable when they are not the legal driver? That makes no sense.

State laws will need to define who "the driver" is in an L3 car. Most currently do not. Even the few that do, for L4+ vehicles, require the OWNER of the car (not the maker of the car) to have liability insurance to operate it without a human.


Handwaving at SAE J3016 doesn't change that, and the original point was that "assuming liability" is not required just to qualify as L3.



And there is precedent. Passengers who ride in a Waymo are not liable. And you are not liable now if you are the passenger in a car that is involved in an accident.

As I suggest above- You also don't own, or operate, a Waymo so that's a bit different.

Most states that DO allow self driving cars require the OWNER to have liability insurance. Which is Waymo. So in an accident the liability is on Waymo.

It's not on who made the car- or even who made the sensors or driving computer.
 
And, speaking as a design EE who's deeply into clocks:
  • Sampling rate of clocks in CPUs subject to temperature variations.
  • Phase variation of so-called synchronized clocks in the system.
  • Propagation delays of I/O subject to temperature variations.
  • Changes in the resistivity of FETs and bipolars in the power train: changes the gain of the power system in response to varying power demand, which changes the poles and zeroes of the control loop.
and on and on and on. As you correctly point out, variational analysis is "fun", especially when it gets to things that are analog, or look analog.
This is my last word on this thread on the subject. I left out something: What to do when things do vary.

Take an op amp filter circuit. It consists of resistors, capacitors, an op amp, and feedback. The gain of the op amp might vary over a 2:1 range; the bandwidth of the op amp (the frequency above which it stops responding to the input signal) might vary as well; capacitors might be +-10%, resistors 1%. And with all that going on, the designer wants a filter circuit whose critical parameters (bandwidth, cut-off frequency, ripple, fidelity in the passband, and so on) hold to some set of numbers. That, in many cases, might need to be more accurate than one would think the underlying components could support.
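
That tolerance stack-up is easy to see numerically. A minimal Monte Carlo sketch for a first-order RC low-pass filter (the nominal values and tolerances here are illustrative, not from this post):

```python
import math, random

def cutoff_hz(r_ohms, c_farads):
    # First-order RC low-pass: f_c = 1 / (2*pi*R*C)
    return 1.0 / (2.0 * math.pi * r_ohms * c_farads)

random.seed(0)
R_NOM, R_TOL = 10_000.0, 0.01   # 1% resistor
C_NOM, C_TOL = 15.9e-9, 0.10    # 10% capacitor

# Draw many plausible builds and see how far the cutoff strays from nominal.
samples = [
    cutoff_hz(R_NOM * random.uniform(1 - R_TOL, 1 + R_TOL),
              C_NOM * random.uniform(1 - C_TOL, 1 + C_TOL))
    for _ in range(100_000)
]
nominal = cutoff_hz(R_NOM, C_NOM)           # ~1 kHz
worst = max(abs(f - nominal) / nominal for f in samples)
print(f"nominal {nominal:.0f} Hz, worst spread seen: +/-{worst:.1%}")
```

Even with a 1% resistor, the 10% capacitor pushes the cutoff on the order of 10% off nominal in the worst draws, which is why the designer can't just buy one tight part and call it done.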

There are very definitely ways to do this. There's this idea of "sensitivity", which is the differential of some parameter of interest with respect to the component that's varying. Write sensitivity equations for the dozen or so parameters one is tracking, then change the topology of the op amp circuit as needed so that, despite the variation of the components, the required accuracy of the desired parameters is maintained. Sometimes it turns out that, of the, say, three capacitors in a circuit, only one has a major impact, and that's the one for which one goes out and gets the +-0.5% version.
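
The sensitivity idea itself is just a derivative. A tiny sketch, estimating the normalized sensitivity S = (x/f)·(df/dx) by a finite difference for the same kind of RC low-pass (values illustrative):

```python
import math

def cutoff_hz(r_ohms, c_farads):
    # First-order RC low-pass: f_c = 1 / (2*pi*R*C)
    return 1.0 / (2.0 * math.pi * r_ohms * c_farads)

def normalized_sensitivity(f, nominal, which, rel_step=1e-6):
    """S = (x / f) * df/dx via a central finite difference.
    'nominal' maps argument names to values; 'which' names the part to perturb."""
    x = nominal[which]
    hi = dict(nominal, **{which: x * (1 + rel_step)})
    lo = dict(nominal, **{which: x * (1 - rel_step)})
    dfdx = (f(**hi) - f(**lo)) / (2 * x * rel_step)
    return (x / f(**nominal)) * dfdx

nominal = {"r_ohms": 10_000.0, "c_farads": 15.9e-9}  # ~1 kHz cutoff
for part in nominal:
    print(part, round(normalized_sensitivity(cutoff_hz, nominal, part), 3))
# Both come out -1.0: a +1% shift in R or C moves f_c by about -1%, so the
# part worth paying for is whichever tolerance is cheapest to tighten.
```

In a multi-pole active filter the sensitivities differ between components, and that's exactly how you find the one capacitor worth buying at +-0.5%.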

Got a control circuit whose damping is critical? Even if it's digital, there's math that can be applied that handles the sensitivity analysis issues.

Most of my training on the subject was old-school: Write differential equations and solve by hand or with the help of a computer. I imagine that in the Professor's courses CAD kicks in, big time.

Having said all that: for good old standard control systems with poles and zeros, controlling all that in the presence of multitudinous varying system parameters can be done. And it is done, each and every day. That doesn't mean the neural network and AI aspects of FSD-b don't give everything fits. But the fact that everything, including the car's components, varies doesn't mean that FSD can't be done. It just makes it tough.
 
Why even bother with L3 and worrying about human takeover when you can just remove the driver completely from the equation? That's precisely why Waymo is only focused on L4 and ignoring L2 and L3 entirely.
Simple. You bother if you don't believe Tesla will be able to support L4/L5 with current hardware, which is the current belief of many. So L3 on the highway lets you text, watch videos and read. That would be terrific. Whether FSD would support L3 on city streets is an entirely different question. Waymo's focus is robotaxis, so of course they would ignore L2 and L3. That's not relevant to this conversation.
 
No car in front. I did not have any problem with this ramp before. The second one was on a normal 2-lane road.
11.4.4 has more phantom brakes than earlier versions.

Were you watching adjacent lanes as well? There's one new undocumented feature of 11.4.4 that I haven't seen discussed: it actively yields to cars with their turn signals on, now.

Extremely obvious on a drive home today where a slow car to my right merged onto the highway and forgot to turn their turn signal off. FSD Beta braked down to 50 MPH to yield to them, and then continued cruising along at 50 waiting for them to merge. Had to tap the accelerator to get past.
 
Were you watching adjacent lanes as well? There's one new undocumented feature of 11.4.4 that I haven't seen discussed: it actively yields to cars with their turn signals on, now.

Extremely obvious on a drive home today where a slow car to my right merged onto the highway and forgot to turn their turn signal off. FSD Beta braked down to 50 MPH to yield to them, and then continued cruising along at 50 waiting for them to merge. Had to tap the accelerator to get past.
There was no adjacent lane. It slowed down from 45 to 20 mph. I had the impression that it was going to stop. In the first case, I pressed the accelerator and it cancelled FSD. In the second case I cancelled FSD myself.
 
You are a full-time passenger for L4/L5 since you are never asked to take over. With L3, you are a sort of "part-time passenger". You are not driving when L3 is engaged but you are still required to take over when the system requests it. So you are not a passenger all the time.....
Actually you are a passenger as long as L3 is actively engaged. You are a driver only when L3 is disengaged. There is no ambiguity about it. L3 MUST "request" that you change from being a passenger to being the driver before liability switches back to you. You simply must be prepared to change from being a passenger to being the driver if the system needs assistance. The only open question is how long of a time frame must you be given to take over. That does need to be defined. However you can not be required to "have been" the driver post accident while L3 is active.
 
The only open question is how long of a time frame must you be given to take over. That does need to be defined. However you can not be required to "have been" the driver post accident while L3 is active.
The answer to this open question is (vaguely): "As long as it takes to stop being a passenger and start performing the OEDR". Most people argue 10-15 seconds. I should also add that the car is driving during the takeover procedure and needs to stop safely if the human isn't able to take over in the specified time. The human is driving only after the takeover procedure is completed.
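
That takeover procedure can be sketched as a small state machine. This is purely illustrative: the state names and the 10-second window are assumptions for the sketch, not anything from J3016 or a shipping system:

```python
from enum import Enum, auto

class L3State(Enum):
    ENGAGED = auto()             # system drives; human is a passenger
    TAKEOVER_REQUESTED = auto()  # system still drives, but has asked the human in
    HUMAN_DRIVING = auto()       # human completed the takeover
    MINIMAL_RISK = auto()        # human never responded; system stops safely

TAKEOVER_WINDOW_S = 10.0  # illustrative; the real window is the open question

def step(state, human_took_over, seconds_since_request):
    if state is L3State.TAKEOVER_REQUESTED:
        if human_took_over:
            return L3State.HUMAN_DRIVING
        if seconds_since_request > TAKEOVER_WINDOW_S:
            return L3State.MINIMAL_RISK  # car must stop safely on its own
        return state  # system keeps driving during the request window
    return state  # ENGAGED stays engaged until the system issues a request

print(step(L3State.TAKEOVER_REQUESTED, False, 12.0))  # L3State.MINIMAL_RISK
```

The grey zone in the liability discussion is exactly the TAKEOVER_REQUESTED state: the system is still driving, but the clock is running on the human.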
 
Actually you are a passenger as long as L3 is actively engaged. You are a driver only when L3 is disengaged. There is no ambiguity about it. L3 MUST "request" that you change from being a passenger to being the driver before liability switches back to you. You simply must be prepared to change from being a passenger to being the driver if the system needs assistance. The only open question is how long of a time frame must you be given to take over. That does need to be defined. However you can not be required to "have been" the driver post accident while L3 is active.

Yes that is what I said. That is why I said you are a "part-time" passenger with L3. You are a passenger when L3 is engaged but you cease to be a passenger when you take over. So you are only a passenger when L3 is on but not necessarily for the entire trip. This is in clear contrast with L4/L5, where you are the passenger all the time because you are never asked to take over.
 
Yes that is what I said. That is why I said you are a "part-time" passenger with L3.
Yes but just clarifying that in liability there's no such thing. At the time of a liability/accident it is one or the other. I was actually more replying to someone else. I just hate quoting them because it becomes 100% endless until you yield.
 
A bit more on the schizophrenic nature of 11.4.4.

Yesterday, ego was attempting a left hand turn on to a highway from a two lane road. There was not a car in sight, and no requirement to stop. Just a simple left with no traffic. Instead of taking a smooth turn onto the highway, ego stopped, jerked the yoke left and right back and forth for a second or three, and then slowly turned. The car behind me was not happy.

On the same drive, a few minutes later, ego made a smooth left turn at a traffic light while following other cars, and made a second smooth left turn at another traffic light while following other cars, with no problem.

It seems that ego can follow other cars with no problem, but when left to take the lead, there can be unexpected problems. I suspect that others already know about this behavior, but the lightbulb just went on for me.

Joe