Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
The next big milestone for FSD is version 11. It is a significant upgrade, with fundamental changes to several parts of the FSD stack, including a totally new way to train the perception NN.

From AI day and Lex Fridman interview we have a good sense of what might be included.

- Object permanence both temporal and spatial
- Moving from “bag of points” to objects in NN
- Creating a 3D vector representation of the environment all in NN
- Planner optimization using NN / Monte Carlo Tree Search (MCTS)
- Change from processed images to “photon count” / raw image
- Change from single image perception to surround video
- Merging of city, highway and parking lot stacks a.k.a. Single Stack
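The planner item above can be made concrete with a toy sketch of Monte Carlo Tree Search. This is purely illustrative and has nothing to do with Tesla's actual code: the state, actions, and reward below are invented (reach exactly 10 by adding 1 or 2 per step), but the four MCTS phases (selection, expansion, simulation, backpropagation) are the real algorithm.

```python
import math
import random

GOAL, ACTIONS = 10, (1, 2)  # toy problem, not a driving model

def step(s, a):
    return s + a  # deterministic toy dynamics

def terminal(s):
    return s >= GOAL

def reward(s):
    return 1.0 if s == GOAL else 0.0  # 1.0 only for landing exactly on GOAL

class Node:
    def __init__(self, state, parent=None):
        self.state, self.parent = state, parent
        self.children = {}            # action -> Node
        self.visits, self.value = 0, 0.0

def ucb1(child, parent_visits, c=1.4):
    # Unvisited children are explored first; otherwise balance
    # exploitation (mean value) against exploration.
    if child.visits == 0:
        return float("inf")
    return child.value / child.visits + c * math.sqrt(
        math.log(parent_visits) / child.visits)

def rollout(state):
    # Simulation phase: random playout until a terminal state.
    while not terminal(state):
        state = step(state, random.choice(ACTIONS))
    return reward(state)

def mcts(root_state, iterations=2000):
    root = Node(root_state)
    for _ in range(iterations):
        node = root
        # 1. Selection: descend via UCB1 while fully expanded.
        while not terminal(node.state) and len(node.children) == len(ACTIONS):
            node = max(node.children.values(),
                       key=lambda ch: ucb1(ch, node.visits))
        # 2. Expansion: add one untried action.
        if not terminal(node.state):
            a = next(a for a in ACTIONS if a not in node.children)
            node.children[a] = Node(step(node.state, a), node)
            node = node.children[a]
        # 3. Simulation.
        value = rollout(node.state)
        # 4. Backpropagation: update statistics up to the root.
        while node is not None:
            node.visits += 1
            node.value += value
            node = node.parent
    # Return the most-visited root action.
    return max(root.children, key=lambda a: root.children[a].visits)

print(mcts(9))  # → 1 (only +1 lands exactly on the goal from state 9)
```

In a real planner the "actions" would be candidate trajectories and the rollout value would come from a learned evaluation network rather than random playouts, which is roughly what Tesla described at AI Day.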

Lex Fridman's interview of Elon, starting with FSD-related topics.


Here is a detailed explanation of Beta 11 in layman's terms by James Douma, in an interview recorded after the Lex podcast.


Here is the AI Day explanation, in 4 parts.




Here is a useful blog post asking Tesla a few questions about AI Day. The useful part is the comparison of Tesla's methods with Waymo's and others' (detailed papers linked).

 
Tried FSD again today, and as per usual disabled it within minutes because I just can’t deal with the erratic and simply incorrect / inappropriate behavior. And this was in not particularly difficult conditions.

It makes me wonder if Tesla is even doing simulations before releasing. It doesn't feel like it. If they are, then they don't know how to do regression testing yet. Or maybe they don't know how to run simulations at all.

Fingers crossed that something improves by V12.
As has been said many times, that's why it's beta.
 
The hilarious-not-so-hilarious part about this well-written post is that we aren’t getting paid to do this job for Tesla, quite the opposite: WE PAID THEM to be beta testers and improve their product!

I’m not saying FSD should be free once it’s truly FULLY SELF DRIVING (at L4 or whatever level makes sense), but until then it’s difficult to swallow that we’ve paid so much for a largely unfinished product (referring to city streets AP).

I think this is why there’s so much debate about all the slow progress etc. I’ll bet that if Tesla made FSD beta free until it was ACTUALLY FSD (L4 or whatever), there would be a lot fewer complaints here.
Having been on many beta tests in my life, I can assure you that even if you PAID people to test it, they would still complain.
 
Interventions are accelerator or brake applications for blocking traffic.

Disengagements are steering overrides (edit: or panic brake events) to prevent a likely collision (like pulling out into traffic in front of an oncoming truck, for example).
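Under those definitions, a simple classifier for a hypothetical drive log might look like this (the event-type names are invented for illustration; Tesla exposes no such log format):

```python
def classify(event):
    """Return the category of a logged driver action, per the definitions above:
    steering overrides and panic-brake events are disengagements;
    accelerator/brake taps for blocking traffic are interventions."""
    if event["type"] in ("steer_override", "panic_brake"):
        return "disengagement"
    if event["type"] in ("accel_tap", "brake_tap"):
        return "intervention"
    return "other"

# Hypothetical log of one drive.
log = [
    {"type": "accel_tap"},       # nudging past a hesitation at a stop sign
    {"type": "steer_override"},  # preventing a pull-out in front of traffic
    {"type": "accel_tap"},
]

counts = {}
for e in log:
    counts[classify(e)] = counts.get(classify(e), 0) + 1
print(counts)  # → {'intervention': 2, 'disengagement': 1}
```

Keeping the two categories separate like this is what makes per-version comparisons meaningful; lumping them together hides whether a release is getting safer or merely less annoying.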

Anyway all seven segments are very typical urban LA driving with nothing bizarre thrown in for fun. It remains absolutely useless and is not becoming more useful over time in any meaningful way.

We’ll see what 11.4.69.420.80085 brings whenever it gets to HW4 on my new car.


I appreciate the clarification, but it makes your results even odder compared to my own experience... what you describe as interventions (the car is mostly doing a safe thing, it's just doing one that's annoyingly slow/cautious for me or other drivers, or picking a wrong lane because that bit of the software is terrible, but nobody's gonna die over it) being only barely more frequent than your rate of actual this-would-cause-an-accident disengagements is not at all my own experience.

I still see a decent # of the first one-- but also being in NC rather than LA I can mostly let the car just do its thing or tweak the accelerator a bit if someone does happen to be behind me and it's a pretty meh event.... actual "likely accident if I do nothing" disengagements are very rare in comparison to that first kind for me.

They're absolutely NOT zero- but at least order-of-magnitude less common-- whereas in your data going back 5 versions they average out to almost exactly equally common.
 
I couldn’t drive for 5 minutes in Seattle without my car doing something actually dangerous. I’ve posted videos of this. The car running red lights, stopping for red lights and then starting to go, slamming its brakes on because it can’t determine the car has enough room etc. It’s crap.
 


Apologies since you say you've posted videos- but do you have any showing it repeatedly trying to run them over and over at different lights every few minutes (rather than say 1 place it has an issue every time)?

Is there something weird about red lights in Seattle?

Because, to be clear, I accept it can run a red light... We've seen the occasional example- and it even did it to me..... once.... at an intersection that was very weirdly angled and you don't clearly see it till the last second (normally there'd be a yield sign there in most cases like it rather than a light).

But just that once, with the sharp angle.

Whereas you seem to describe a situation where it runs them every less-than-5-minutes all the time.

That's pretty clearly NOT typical or there'd be actual reported accidents from it, and quite a few at that given like 400k people using it.
 
As mentioned before, it's possible that the widely differing member experiences are due not only to location but also to model. I have the same MSP as @WilliamG, and my experience is very similar. It seems like Model 3/Y do better with FSDb than refreshed S/X models.
 
I’m not saying it’s running red lights all the time. I’m using it as a data point to state that the car makes unsafe maneuvers on a daily basis. No, that’s not accurate. On a trip basis.
 
I don’t think any of the FSDb versions could be considered ‘safe.’ They’re clearly labeled as betas, you have to click an extra agreement to use them and they disengage if you don’t pay attention. Not sure what the point of debate is here.

In general they have been getting progressively better and safer, but it’s also been generally agreed that 11.4.4 is worse than the previous version.
 
I use FSDb for the majority of my drives. I don’t keep a detailed log of interventions/disengagements. Subjectively it’s steadily gone down through the versions with the exception of the most recent 11.4 release. The majority of my interventions are accelerator presses done out of courtesy - situations where my experience is that the car will handle things fine, just slower than everyone around me would like.

I drove from work to an appointment this morning - 30 min, two disengagements, both of which were for cloverleaf exits. Something that FSD handles poorly and MPLS unfortunately has a lot of.

I also routinely drive 150 miles to our cabin - a combination of local roads, feeder roads, state highways and interstates. FSDb generally handles this drive with 0 or 1 intervention.

Is it perfect? No. Is it useless? No.
 
Can you cite any accidents caused by V11.x releases to support the claim that it's actually unsafe versus, say, the more likely explanation that "some especially nervous drivers FEEL it's unsafe despite having had no accidents with it"?





Hugely YMMV of course.

I agreed earlier lane selection, particularly, can be poor and indeed unpredictable- but almost always in a way quite trivially easy to correct for in my experience at least.

Uncomfortable has NOT been my experience, other than, as I mentioned, I wish it wasn't quite so slow around stop signs, and maybe that it accelerated from red lights a bit quicker... but you've got guys like Alan who repeatedly claim the braking is too harsh or whatever. As I say, very subjective stuff, and you'll never make EVERYONE happy on that kind of thing, so hopefully we get more granular behavior settings eventually.




To me it's been more relaxing than manual driving for a long time now, even if a small % of the time I need to do something manually- that still means most of the time I DO NOT have to other than monitoring- which I'd be doing anyway if driving manually.... this despite remaining VERY far from >L2 outside highways. Course I also paid a lot less than 15k for it (so did most people).

Of course unsafe isn't limited to accidents alone and there's no shortage of unsafe instances reported - some of which include road rage responses to FSD SNAFUs. It's not just FSD equipped vehicle occupants complaining about unsafe actions.

I don't follow the NHTSA reports but I frequently hear it referenced for FSD related accidents and deaths.
 
Other disruptions

Locksmiths typically make more auto keys than any other kind of key.
Apple and Android were hoping to provide the digital keys.

Tesla is disrupting locksmiths because EVs no longer need traditional auto keys.
Apple missed this opportunity, and we don't need them to be a digital key.

Amazing
 
I’ll give you one example on segment one of my test route which, in two years, the car has never successfully navigated.

The car is supposed to stop at the first intersection, proceed to the next intersection and make a right turn (orange line). This is all clearly marked with clear signage.

The two blue Xs are where FSDb just kind of stops, usually indefinitely. If I don’t press the accelerator or disengage I’ll just kind of finish out my life there I guess.

The yellow line is the path the car typically takes. That gap to the right of the lane is marked perpendicular parking so if nobody is parked there I’ll kind of let FSD fumble around to see what happens but more often than not it lunges towards parked cars and I disengage. Turning right from the second blue X is illegal so I typically disengage if that’s where the car is headed so I don’t get a ticket.

Anyway that’s the first 90 seconds of my test.

(Attached map of the route segment: IMG_5369.jpeg)
 
For what it's worth, here's a unique case. Unknown version. Stuff like this really needs a separate thread.

 
I don’t think any of the FSDb versions could be considered ‘safe.’ They’re clearly labeled as betas, you have to click an extra agreement to use them and they disengage if you don’t pay attention. Not sure what the point of debate is here.

I think my point is it's hard to have useful discussion if we can't even agree on definitions.

The fact that we've seen folks who routinely disengage in situations others do not, and vice versa (and neither group appears to be getting into accidents), suggests that at least SOME of the things some people here report as "dangerous safety disengagements" really weren't. Certainly some ARE, but not as many as people appear to be reporting. They were just uncomfortable with the behavior, even if it was objectively "safe" not to intervene, so they chose to intervene and it then got classed in their mind as a safety disengagement.


The majority of my interventions are accelerator presses done out of courtesy - situations where my experience is that the car will handle things fine, just slower than everyone around me would like.

Same for me for a long time... though I'd say honestly, "dove into the wrong lane for the route" interventions have gotten a lot more frequent in recent versions too. Those still aren't typically safety interventions. I wouldn't get into an accident if I left it alone, but I might well end up off-route and take longer to get where I'm going while it figures that out, so I intervene to get it back in the correct lane. Fantastically annoying, but not dangerous.


Of course unsafe isn't limited to accidents alone and there's no shortage of unsafe instances reported - some of which include road rage responses to FSD SNAFUs. It's not just FSD equipped vehicle occupants complaining about unsafe actions.


That's the point of the driver being required to pay attention though. If there's no FSD accidents (or essentially none, statistically) that suggests it's safe when used correctly... even if it makes you personally nervous enough to intervene often.

That does not mean it's safe to use WITHOUT paying attention, of course. Tesla has explicitly told us the car lacks an OEDR capable of that; that's the exact reason it needs the human there. You can see that in the "hydroplanes into pond" video posted above: FSD was just driving along straight, with no idea what the "flooded" sign means... but the human should have known, and intervened.



I don't follow the NHTSA reports but I frequently hear it referenced for FSD related accidents and deaths.

I think you think you hear that.

To my knowledge there have been 0 actual deaths on FSD, though any number of "reports" initially claimed otherwise and then turned out to be an at-fault human behind the wheel... like the infamous Texas Model S crash, where the cop initially was SURE that self-driving was to blame and it turned out not to be involved at all... plus the cases of basic AP being called FSD and blamed for things like hitting emergency vehicles, when it was the legacy code involved after all.



And the only "accidents" I'm aware of on FSDb proper are:

1 self-reported alleged one nobody can otherwise find record of (a sideswipe I think)
and
1 youtuber who hit a couple of those little parking posts going around a corner/turn.


I suppose we can add the pond video to that list, though it's clearly a case where the human should have done something and was given a big orange sign telling them to do so, so I'm not sure we ought to count that one... the other 1 (or 2, if you believe the self-report with no official evidence) out of 400,000 people using it are cases where the human wouldn't have had sufficient warning/time/reason to intervene, so I'd certainly blame that 1 (or possibly 2) on FSD. Which is a pretty crazy low rate of accidents.
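Taking the post's own numbers at face value (they are anecdotal forum figures, not official Tesla or NHTSA data), the implied per-user rate works out to:

```python
# Poster's figures: 1-2 candidate FSDb accidents among ~400,000 beta users.
users = 400_000
for accidents in (1, 2):
    share = accidents / users
    print(f"{accidents} accident(s): {share:.6%} of users")
# → 1 accident(s): 0.000250% of users
# → 2 accident(s): 0.000500% of users
```

Note that a per-user count says nothing about the per-mile rate, which is what real safety comparisons would need; miles driven per user vary enormously.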
 
The driver is ultimately responsible but the driver can also be distracted which is one of the main arguments for using FSD. If FSD isn't safer then the circular argument fails as it puts more burden and sense of urgency to respond for the potentially distracted driver. In the end, if it walks like a duck, talks like a duck, it's a duck.
 
In the end, if it walks like a duck, talks like a duck, it's a duck.
Right! And Duck Mode needs some work. Right now it behaves more like Squirrel Mode. Try this: yell... duck... or... Siri, duck... as a remote command from your Apple Watch, and watch what happens... I tried that at an underpass... not pretty...
 
The fact we've seen folks who routinely disengage in situations others do not-- and vice versa (and neither group appears to be getting into accidents) suggests that at least SOME of the things some people here report as "dangerous safety disengagement" really wasn't. Certainly some ARE though, but not as many as they appear to be reporting- they were just uncomfortably with the behavior even if it was objectively "safe" to not intervene in the situation so they choose to intervene and then it gets classed in their mind as a safety disengagement.
I'm always shocked when I read people who say their AP/FSD Beta bounces around in their lane and doesn't stay centered, or tries to make a turn where there is no road/street. I feel like these cars have a fundamental issue (either software corruption, or hardware failure) that must be addressed or they will never have a proper experience with the ADAS system.
 
The "debate"
I don’t think any of the FSDb versions could be considered ‘safe.’ They’re clearly labeled as betas, you have to click an extra agreement to use them and they disengage if you don’t pay attention. Not sure what the point of debate is here.
The debate is over what the word safe means. You might have a heart attack or a stroke and disengage when FSDb does something "unsafe" that I might just let play out. You might believe that no one in their right mind would let that maneuver proceed, and I might just say: not so bad, I can still correct if needed. My confidence driving in complexity might be very different from yours. I drove a taxi in New York City for much of the '70s, so a little "I almost hit/got hit by a car" constantly just seems normal. When FSDb does something a bit off it scares the bejeebus out of my wife; me, I just say, not so bad, I got this.

I was driving our 4Runner yesterday and noted it was very unsafe. If I didn't turn the steering wheel constantly or apply the brakes whenever needed, it would have driven right off the road. Talk about unsafe. We have, however, seen a real uptick in emergency heart attacks and strokes attributed to FSDb. There are a lot of delicate hearts and brains out there...