Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
The next big milestone for FSD is v11. It is a significant upgrade, with fundamental changes to several parts of the FSD stack, including a totally new way to train the perception NN.

From AI Day and the Lex Fridman interview, we have a good sense of what might be included:

- Object permanence both temporal and spatial
- Moving from “bag of points” to objects in NN
- Creating a 3D vector representation of the environment all in NN
- Planner optimization using NN / Monte Carlo Tree Search (MCTS)
- Change from processed images to “photon count” / raw image
- Change from single image perception to surround video
- Merging of city, highway and parking lot stacks a.k.a. Single Stack
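Of these bullets, only the planner item names a concrete algorithm: Monte Carlo Tree Search. As a rough illustration of what MCTS-based planning means — a toy sketch with invented dynamics and reward, not Tesla's actual planner — here is a minimal UCT-style search choosing a longitudinal action (brake / hold / accelerate) in a one-dimensional driving model with an obstacle ahead:

```python
import math
import random

ACTIONS = (-1, 0, 1)                 # brake, hold, accelerate
MAX_SPEED, HORIZON, UCB_C = 5, 6, 1.4

def step(state, action):
    """Toy dynamics: state = (speed, distance_to_obstacle)."""
    speed, dist = state
    speed = max(0, min(MAX_SPEED, speed + action))
    dist -= speed
    if dist <= 0 and speed > 0:      # ran into the obstacle
        return (speed, dist), -100.0, True
    return (speed, dist), float(speed), False   # reward forward progress

class Node:
    def __init__(self, state, reward=0.0, done=False):
        self.state, self.reward, self.done = state, reward, done
        self.children = {}           # action -> Node
        self.visits, self.total = 0, 0.0

def uct(parent, child):
    """Upper-confidence score balancing exploitation and exploration."""
    if child.visits == 0:
        return float("inf")
    return (child.total / child.visits
            + UCB_C * math.sqrt(math.log(parent.visits) / child.visits))

def rollout(state, depth, rng):
    """Random playout to the horizon; returns the accumulated reward."""
    ret = 0.0
    for _ in range(depth):
        state, r, done = step(state, rng.choice(ACTIONS))
        ret += r
        if done:
            break
    return ret

def plan(root_state, iters=2000, seed=0):
    rng = random.Random(seed)
    root = Node(root_state)
    for _ in range(iters):
        node, path, ret, depth = root, [root], 0.0, 0
        # 1) Selection: walk down fully expanded nodes by UCT score.
        while not node.done and depth < HORIZON and len(node.children) == len(ACTIONS):
            node = max(node.children.values(), key=lambda c: uct(path[-1], c))
            ret += node.reward
            path.append(node)
            depth += 1
        # 2) Expansion: try one action not yet in the tree.
        if not node.done and depth < HORIZON:
            a = rng.choice([a for a in ACTIONS if a not in node.children])
            s, r, done = step(node.state, a)
            node.children[a] = child = Node(s, r, done)
            ret += r
            path.append(child)
            node, depth = child, depth + 1
        # 3) Simulation: random rollout from the new leaf to the horizon.
        if not node.done:
            ret += rollout(node.state, HORIZON - depth, rng)
        # 4) Backpropagation: credit the return to every node on the path.
        for n in path:
            n.visits += 1
            n.total += ret
    # Act on the most-visited root action (the standard MCTS choice).
    return max(root.children, key=lambda a: root.children[a].visits)
```

With the car at speed 3 and an obstacle 6 units ahead, the only collision-free trajectories within the horizon start by braking, so the search concentrates its visits on the brake action. A real planner would score candidate trajectories on comfort, safety margins, and intervention likelihood rather than a scalar toy reward, but the select / expand / simulate / backpropagate loop is the same idea.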

Lex Fridman's interview of Elon, starting with the FSD-related topics.


Here is a detailed explanation of Beta 11 in "layman's language" by James Douma, in an interview done after the Lex podcast.


Here is the AI Day explanation, in 4 parts.




Here is a useful blog post asking Tesla a few questions about AI Day. The useful part is the comparison of Tesla's methods with those of Waymo and others (detailed papers are linked).

 
For a very long time I hung on to Elon’s every prognostication about automated driving. This goes all the way back to the 2017.26? branch that was SILKY SMOOTH! (it wasn’t).

FSD clearly is a ruse in that they advertised and sold a product with vivid emphasis and repetitive bolstering for six years and it still hasn’t arrived. Today we’re at HW3 + “at least six months” and I don’t know what else to call that beyond a ruse. They could have very easily tested FSDb in exactly the same manner they do now, for free, and sold the product when it was done.

Much like the Hyperloop or The Boring Company (remember bolting extra wheels onto the side of a car with Jay Leno?), FSD is a technical achievement pretending to solve a problem. That's neat, provided your customers haven't devoted their resources to your promise. Maybe gluing extra wheels to the side of a Chevy Bolt and stuffing it into a hole will be common some day, but there's a reason they're not selling Boring Bolt Hole Passes right now.

I’m of the mind that, like Twitter, Elon’s enthusiasm got the best of him at the worst time and he simply didn’t understand the first principles of the problem. But then, like Twitter, he was too far in and became mired in his own nonsense.

I also believe that he just doesn’t drive very much, maybe less than a thousand miles a year. And when he does drive it’s on bespoke software that is truly a technical revelation in some respects while being so far from complete that the technical breakthroughs give him a false sense of a timeline. He sees something remarkable, extrapolates, moves on to something else, and ends up at a “false horizon.”

I don’t know what else to call that but a charlatan selling a ruse. I don’t believe he set out in that manner but after the first five or six horizons he clearly knows better. And I fell for it all, hook, line and sinker.
You call it correctly and accurately.
 
I suspect that Elon has been practicing “fake it till you make it” for so long, that he doesn’t actually understand how to stop faking it… or perhaps he thinks that he can always fake it. One need look no further than Elizabeth Holmes to see how dangerous it is to fake it a bit too much. This said, America needs Elon Musk, and he will never (I predict) be prosecuted for faking it (including lying and misleading investors the way Elizabeth did). The man is as untouchable as a human being can be, and the only way to get him to change will be to hit him in his pocketbook by not making the same mistake twice (or more).

Joe
There's a huge difference between Elon Musk and Elizabeth Holmes. Holmes never had any actual working technology; everything was a fraud. Elon and Tesla have produced quite a bit. I'm not trying to argue that he has been anything close to accurate with his statements, simply pointing out that the two are not at all in the same category.

I suggest you listen to the "Bad Blood" podcast or read the book by John Carreyrou. It's incredibly interesting and shows the lengths to which Holmes and Sunny Balwani went to cover up their fraud.
 
Tried 11.4.7 at my 4-way stop, and today it did acceptably well. The creep was much less, and it made the turn with no hesitation. A big improvement over recent versions. I was happy about that small improvement.
Every once in a while things randomly look promising but it's unfortunately short lived.
 
Well, that’s not good…

“Tesla FSD 11.4.7 fails to stop for a father and his son while crossing the street in a school zone. I had to disengage FSD as it was clear FSD was not going to stop for them.” Video of the event below.

The correct thing to do is to stop. But just because it's correct doesn't mean it's safe.

In my area, many pedestrians will cross like this with the intention of crossing right behind your vehicle. Slowing down abruptly not only confuses the pedestrian, but can also lead to vehicles behind you rear-ending your car.

It's hard to tell from the video, but my intuition is saying these pedestrians were planning on passing just behind the Tesla.
 
The correct thing to do is to stop. But just because it's correct doesn't mean it's safe.
I agree, that wasn't a crosswalk. They were jaywalking. The dad waved when the Tesla stopped, knowing that they really didn't have the right of way. He was using his son as a shield and took a chance, but probably would have stopped and waited if the car had not.

But I would also have stopped, because, who knows what's going through the mind of someone teaching their kid to jaywalk?
 
The correct thing to do is to stop. But just because it's correct doesn't mean it's safe.

I could understand the adult assertively crossing if it was a school zone as described. Unfortunately, FSD is context-blind, school zones included.
 
I agree, that wasn't a crosswalk. They were jaywalking. The dad waved when the Tesla stopped, knowing that they really didn't have the right of way. He was using his son as a shield and took a chance, but probably would have stopped and waited if the car had not.
It's a crazy dichotomy: FSD brakes hard for pedestrians walking on sidewalks parallel to the roadway, yet refuses to stop for, or even acknowledge, pedestrians crossing the roadway. Something has to give with that logic.
 
I have a 2018 MX with Full Self-Driving and the MCU upgrade. I got 2023.7.30 a week or so ago, so I tried it out driving from Livermore to Pleasanton (California) on westbound Stanley Boulevard. There is a right turn the car has to take to get to our home, and Stanley Boulevard turns there as well.

As we approached the right turn, the car had to pass through an empty green bike lane and into the right-hand turn lane. Instead, the car attempted to change lanes to the right but started an abrupt shuddering, shaking back and forth to the point that my girlfriend shouted, "What are you doing?" I quickly took over, stopped the shuddering, and brought the car into the proper lane for the turn. Needless to say, FSD is not ready for prime time yet.

It does well driving straight on the freeways, though.
 
Every once in a while things randomly look promising but it's unfortunately short lived.
Agree to some extent, but when I look back at the first drives I did in October 2021, the improvement compared to how FSD drives today is significant. It just needs a lot more improvement to reach Elon's objective. I will continue to say: give me L3 on highways, then on city streets, and I, along with most people, would be happy. Won't happen, though.
 
Agree to some extent, but when I look back at the first drives I did in October 2021, the improvement compared to how FSD drives today is significant. It just needs a lot more improvement to reach Elon's objective. I will continue to say: give me L3 on highways, then on city streets, and I, along with most people, would be happy. Won't happen, though.
Yep. I tend to weigh how far away things are today against how stubbornly slowly things progress, and against the inability to fix bugs and minimize regressions.
 
For a very long time I hung on to Elon’s every prognostication about automated driving. This goes all the way back to the 2017.26? branch that was SILKY SMOOTH! (it wasn’t).

After canceling my FSD subscription, now 2 1/2 months ago, and using the included Autopilot only, I wouldn’t use FSD Beta in its current state even if it were free. My daily commute has been immensely better using Autopilot than it ever was with FSD Beta. One of those “step away and let the clouds clear” kinds of moments.