Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
The next big milestone for FSD is v11. It is a significant upgrade, with fundamental changes to several parts of the FSD stack, including a totally new way to train the perception NN.

From AI Day and the Lex Fridman interview we have a good sense of what might be included:

- Object permanence, both temporal and spatial
- Moving from “bag of points” to objects in NN
- Creating a 3D vector representation of the environment all in NN
- Planner optimization using NN / Monte Carlo Tree Search (MCTS)
- Change from processed images to “photon count” / raw image
- Change from single image perception to surround video
- Merging of city, highway and parking lot stacks a.k.a. Single Stack
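One of the more concrete items on that list is the planner using MCTS. As a rough illustration of the general idea only (this is not Tesla's code; every name here, including the action set and the random rollout, is hypothetical), a toy MCTS over a handful of discrete maneuvers looks like:

```python
# Toy Monte Carlo Tree Search over discrete driving maneuvers.
# Purely illustrative: in a real planner the rollout would be a learned
# cost/value estimate, not random noise.
import math
import random

ACTIONS = ["keep_lane", "nudge_left", "nudge_right", "brake"]

class Node:
    def __init__(self, action=None, parent=None):
        self.action = action
        self.parent = parent
        self.visits = 0
        self.value = 0.0

def ucb1(node, c=1.4):
    # Upper Confidence Bound: trades off exploitation vs. exploration.
    if node.visits == 0:
        return float("inf")
    return node.value / node.visits + c * math.sqrt(
        math.log(node.parent.visits) / node.visits)

def rollout(depth=5):
    # Stand-in for evaluating a simulated trajectory from this maneuver.
    return sum(random.uniform(-1, 1) for _ in range(depth))

def mcts(iterations=200):
    root = Node()
    children = [Node(a, root) for a in ACTIONS]
    for _ in range(iterations):
        root.visits += 1
        leaf = max(children, key=ucb1)   # selection
        reward = rollout()               # simulation
        leaf.visits += 1                 # backpropagation
        leaf.value += reward
    # Pick the most-visited maneuver as the plan.
    return max(children, key=lambda n: n.visits).action

print(mcts())  # prints one of the four actions
```

The appeal over a hand-coded planner is that the same search scales to deeper lookahead, with an NN pruning which branches are worth simulating.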

Lex Fridman's interview of Elon, starting with the FSD-related topics.


Here is a detailed explanation of Beta 11 in "layman's terms" by James Douma; the interview was done after the Lex podcast.


Here is the AI Day explanation, in 4 parts.




Here is a useful blog post asking Tesla a few questions about AI Day. The useful part is the comparison of Tesla's methods with Waymo's and others' (detailed papers linked).

 
Oh… I think I know why. You haven’t been worshipping at the altar of His High Holiness Musk every evening, now have you? Let me read a passage from His Holiness’s Scriptures, King Musk Version:

Book of Tesla, Chapter 1, verse 69:

”Woe beith to the soul who failith to praise his High Holiness Musk, the father, son, and spirit of FSD, every night betweenith thine hours of 6:13 pm and 6:29 pm every night, for he shall be cast into the depths of FSD hell and no strikes shall ever be removed withoutith he shall waitith the mandatory period, as definithed by his High Holiness”.

Say amen brother.

Joe
It’s FSDj…. 🤣
 
I had an interesting thing in a parking lot today. There are three speed bumps in a row, big ones. It slowed down to 5 MPH for the first one, then stayed really slow and cagey, and went around the second by driving through the empty parking spots on the right. (I normally do that as well if the spots are empty, and the car in front of me did it too.) Did it just follow the car that was way in front of me, or did it do it on its own? I went around to try again, but someone had parked in one of the spots blocking that path.
 
I had an interesting thing in a parking lot today. […]
Well, FSDj currently doesn’t have A.S.S., only “dumb and dumber” summon. The Great Charlatan mentioned A.S.S. may arrive with his pronounced pot-smoking-induced 11.420 release.
 
We are always showing how the B-pillar cameras struggle, and only occasionally how the center cameras can be occluded. Here is an example that happened yesterday (11.4.2). I was behind a bus when it stopped, and Ego was extremely aggressive, immediately assuming it was at a bus stop and not just a bus stopping. I could see around it (since I can pivot my head) and knew there was traffic coming. However, Ego could not see at all and just pulled out immediately on “faith”. I let it play out because there was JUST enough room (Sunday morning low traffic), but I did touch the go pedal to make it less annoying.

I would have probably waited on the cars since I could see all of them coming. But clearly Ego can't see what is coming.

 
Owner/operator is a contradiction in terms if L3-L5 is engaged, since you are NOT the operator. So you have an L4 vehicle, it is driving, and you are reading a book (or napping in the passenger or back seat). Your car runs over and kills a pedestrian. So you, the owner, are at fault/liable and may even spend time in prison or on parole for an involuntary manslaughter conviction. Also, you will have your driving record permanently blemished even in a minor accident, plus the possibility of being sued personally and your insurance NOT covering you (since you were NOT the driver)?

Please STOP filling this thread with nonsense discussion about liability.

Your understanding of liability is fundamentally mistaken.

So all your points argued from that are flawed and just useless noise in this thread.

If I turn on a machine, I am the operator.

Period. Full stop.

Even if that machine is intended to run while I walk away. Even if I get on an airplane and fly around the world I am STILL the operator.

But even the operator is a small part of liability.

If a stranger came over and turned on that machine without my permission or knowledge, I would STILL share liability.

With liability, it’s not zero-sum. There’s plenty to go around.

As auto-driving cars inevitably kill/maim more and more people (lots fewer than humans), you can be damn sure there will be tons of lawsuits in all jurisdictions. Against the owners, the person in the driver seat, Tesla, etc.

Every nook and cranny of liability will be tried and appealed for years.

THOSE years of court precedent will determine the liability of automated vehicles. It will be 30 lbs of leather-bound case law before the dust settles.

The useless faffing in this thread trying to sound definitive about who will be liable is crayon scribbling on a Denny’s menu.
 
Please STOP filling this thread with nonsense discussion about liability. […]
It’s sort of comical that you are complaining about others’ liability posts when yours is one of the longest! We did try to point folks to the autonomy “Primer” thread for these detailed discussions, but people will be people. 🤣
 
Please STOP filling this thread with nonsense discussion about liability. […]
Will depend on jurisdiction as well. This is what's planned in the UK at the moment:

"The legislation will build on existing laws, and state that manufacturers are responsible for the vehicle’s actions when self-driving, meaning a human driver would not be liable for incidents related to driving while the vehicle is in control of driving."
 
Will depend on jurisdiction as well. […]
That means that the UK won't get FSD for a long time unless level 2 is not considered "self-driving".
 
But again the original, untrue, claim was that L3 "requires" the maker of the driving system to assume responsibility for accidents. That simply ain't so, because the SAE definition of L3 does not mention, or care, about legal liability.
Yes. Would anyone in their right mind use it or buy it if the manufacturer didn't put up guarantees though?

"Yes, you don't have to pay attention, but it may fail at any time"...

I would never ever trust such a system. I believe strongly that the free market will sort this one out for us.
 
I've noticed a huge bump in confidence when making a turn from a light or stop sign. At most signs it stops far enough forward for me to see past obstructions and double-check its decision; at some signs it stops a little further back but still proceeds with maximum confidence even though I can't see cross traffic. I assume it wouldn't do that unless it was able to determine the path was safe. It wouldn't be programmed to just blindly take off, would it? Lol
 
Yes. Would anyone in their right mind use it or buy it if the manufacturer didn't put up guarantees though? […]


I don't disagree with any of that-- but the original claim being debunked was that liability transfer was REQUIRED for a system to be L3, and that just ain't so.

The context was a post suggesting Tesla could do L3 on highways right now with just a couple of specific changes to the code-- and then someone replied insisting it can't be L3 until Tesla takes legal liability.

SAE levels don't address or care about legal liability at all.
 
That means that the UK won't get FSD for a long time unless level 2 is not considered "self-driving".
It means there will be an objective way to tell if the car was in autonomous driving mode when the accident happened, and so the insurance can recoup its losses from the manufacturer if the car was at fault. It makes it the responsibility of the manufacturer whether to allow the driver to not monitor the car's behaviour. It also makes it easier for the unsuspecting consumer, and not just in the UK, to know if he is buying a car that is actually capable of autonomous driving as far as its manufacturer is concerned.
 
Don't assume - check and be ready to stop. It can definitely make mistakes.
It can be hard to check with the way it works now. It stops at a sign with a brick building to the left and a bush to the right, stops at the line, creeps forward a bit; I still can't see, but I assume (hope?) it can see, as it darts out with brisk acceleration. I have my foot near the brake, but I was just wondering if it can see what I can't, or if it's pulling out on hopes and dreams.
 
11.4.4 for first time yesterday on 120 miles of highway got 2 very hard PB's in high traffic. Scary. Disappointed. They were gone or very light on 4.2 and 4.3.
"For the first time" is the issue. Every new install needs a few days to settle which eliminates a lot of steering wheel spaz and phantom braking. I have never find my first drive to be impressive or comfortable. Sometimes not even usable. However a week in my drives are always 100% better.
 
"For the first time" is the issue. Every new install needs a few days to settle which eliminates a lot of steering wheel spaz and phantom braking. I have never find my first drive to be impressive or comfortable. Sometimes not even usable. However a week in my drives are always 100% better.
Yea, that's interesting. I have sort of felt that way, but wasn't sure if it was just me or just my car or what. Any idea why this might be the case?
 
Yea, that's interesting. I have sort of felt that way, but wasn't sure if it was just me or just my car or what. Any idea why this might be the case?
My guess is it's a combination of some internal calibration plus some super-cautious training wheels that are time-sensitive and slowly relax over a week.

Sometimes just resetting the MCU will eliminate all the problems. But the car is definitely crazy the first few drives. Brake slamming is very common.
 
Finally got out and about with 11.4.4. Not a long trip to and fro to retrieve a jacket, but interesting.
  • Unprotected left turn from an unmarked road with a stop sign onto a busy two-lane road. It stopped at the sign, "crept" (a bit faster than that) to the rendered blue stop line, and waited patiently. This time of day it's not unusual to wait five minutes for a gap to turn up. The visualization was interesting: the car's FSD-b was keeping an eye on the passing cars for an opening. The cars were colored blue, all of them. No jumping around, just patient waiting. (After reading all the comments, I was expecting suicide, so I had my foot poised over the brake.) After a few minutes the traffic coming from the left paused, and there was a mid-sized gap in the lane coming from the right. The car began to ease up slowly (and I was getting set to gas it), then gassed it itself. The gap it inserted into was one that I would have used myself, so this is a pass.
  • The next intersection has an off-ramp from an interstate with a stop sign, right turn only. This is followed by an off-ramp onto a major secondary highway; the NoA path was onto the secondary highway. A car was waiting at the light and, with space to spare before I got there, the driver pulled out and headed straight down the road I was on. A semi with a trailer pulled out and went for the off-ramp, blocking access to the ramp for FSD-b. In previous iterations of this kind of thing the car would more or less give up. This time, it slowed, waited for the semi to pass, and then followed the semi onto the ramp. As good as a human, and an improvement over before. Another pass.
  • This next one, not so much: this off-ramp gets its own lane and, once on the major secondary highway, runs almost immediately into a traffic-light-controlled intersection. Straight ahead is the world's shortest "merge" lane; the idea is that, if one is going straight, one has to move one lane to the left almost immediately. FSD-b in the past has gotten confused here and, thinking it had nowhere to go, turned right into the shopping center. If traffic is lighter, it'll get over sooner and not get stuck. This time, heavy traffic: it started blinking its left turn signal for the merge it wanted to make; then, after 20 yards, turned it off and attempted a right. I intervened. So: a D-. It at least tried to get over, but didn't succeed. And if it had left that turn signal on, the locals who know the area would have let the car in, just like any other car that gets stuck there.
  • At this point it stayed in the right lane of a 3-lane road. Only a couple-three miles in heavy traffic, with a lot of cars to the left, so I'll give the car a pass on not wanting to move a lane over.
  • Turned right into a parking lot. Not slowly, which was good. Pass.
  • Return trip: the initial big turn is an unprotected left from a lined, two-lane road with a stop sign onto a four-lane road (two lanes each direction). No traffic coming from the right, but there was moving traffic going leftwards in both lanes on the far side. It waited for an opening, then went for it, taking the left lane; another lane immediately opened up, and the car stayed in the lane it was in. Which was correct: two lanes turn left here. When the light changed, it moved with the flow and got into the far right lane, perfectly legal. Pass.
  • On the way back, it signaled a couple of times that it wanted to move into the faster center lane, then cleared the signal when Jersey Drivers zoomed up to block. After a bit, with no traffic zooming, the car made the move to the center lane. Not bad. Pass.
  • In the center lane; there's a right off-ramp coming up. We were within 0.6 miles with a Great Big Gap and the car still wasn't moving over, so I did it myself. It might have done it on its own. C-.
The rest of the trip was without incident, including dodging a FedEx truck parked on the right.

Conclusion: After a few days, it seems to be working as well as or a little better than 11.4.3. Have to take some longer trips.
 