FSD Beta Videos (and questions for FSD Beta drivers)

I am certainly not asking anyone for their opinion. I am documenting behavior that has been sporadically reported in the past. When did I ask anyone for their assistance?

This also might be an example of when FSD attempted (erroneously) to evade an accident. If FSD truly believed the truck was in my lane, and the only way to avoid a collision was to back away from it, this is perhaps the first documented case. That's the only reason I speculate it stopped and backed. And, as I said in my original post, there was no danger of collision. But if FSD believed there was (again, erroneously), this was an evasive maneuver.
Unfortunately you didn't have the visualization (and even if you did, it wouldn't show this, AFAIK), but could it be that it predicted the truck might cross your path (e.g., turning into a driveway, then continuing on the path it ended up taking)? That would be another possible explanation.

Overall it seems Tesla has tuned their system (even with TACC/AP) to be very skittish around trucks (probably because of several well-publicized crashes involving them, some fatal).

Also, it would be interesting to see if it applies any logic based on cars coming from behind. I proposed that a long while back, regarding phantom braking. If they can factor that into the probability for deciding to brake, it might make things safer (if the system knows no cars are following behind, the risk of a rear collision from an unnecessary brake application is much lower or non-existent). Not sure if they already have this in AP or not.
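The idea above can be sketched in a few lines. To be clear, this is a purely hypothetical illustration: the function name, distances, and thresholds are all invented, and nothing here reflects Tesla's actual AP/FSD code.

```python
# Hypothetical sketch: factor rear traffic into a hard-braking decision.
# All names and thresholds are made up for illustration only.
from typing import Optional

def should_hard_brake(collision_prob: float,
                      follower_distance_m: Optional[float],
                      threshold: float = 0.7) -> bool:
    """Brake hard only if the estimated collision probability exceeds a
    threshold, which is raised when a vehicle is following closely,
    since an unnecessary hard stop then risks a rear-end collision."""
    if follower_distance_m is not None and follower_distance_m < 20.0:
        threshold += 0.2  # demand more certainty before braking
    return collision_prob >= threshold

# With no follower, a 0.75 collision estimate triggers braking...
print(should_hard_brake(0.75, None))   # True
# ...but with a car 10 m behind, the same estimate does not.
print(should_hard_brake(0.75, 10.0))   # False
```

The point is simply that the same perception output can justify different actions depending on what is behind the car, which is what the suggestion amounts to.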

Comparing the investigation populations: of the 416k subject Tesla vehicles there were 354 complaints, but zero that involved crashes, injuries, or fatalities. For Honda's investigation of 1.7 million vehicles, there were 278 complaints, but 6 that involved crashes or injuries. Scaling it down, you would expect 1-2 cases involving crashes or injuries related to Tesla phantom braking from the same population. If you include all other Teslas (including the ones not under investigation), there should have been even more. Maybe Tesla is already using the rear cameras?
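The "1-2 expected cases" figure is just a proportional rate scaled between the two fleet sizes; a quick sanity check of the arithmetic (my calculation, not from either NHTSA filing):

```python
# Scale Honda's 6 crash/injury complaints (fleet of ~1.7M vehicles)
# to the size of Tesla's investigated population (~416k vehicles).
honda_crashes = 6
honda_fleet = 1_700_000
tesla_fleet = 416_000

expected = honda_crashes * tesla_fleet / honda_fleet
print(round(expected, 2))  # 1.47, i.e. "1-2" expected cases
```

This assumes the two populations would experience crashes at the same per-vehicle rate, which is exactly the comparison being debated.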
 
Comparing the investigation populations: of the 416k subject Tesla vehicles there were 354 complaints, but zero that involved crashes, injuries, or fatalities. For Honda's investigation of 1.7 million vehicles, there were 278 complaints

But like 2/3rds of the Tesla complaints came after a media blitz about phantom braking that included information about filing a complaint. I don't think there was any media reporting of the Honda issue until recently. So if the two had the same reporting, there would probably be a lot more complaints filed. (And people who had a crash are more likely to file a complaint.)

So I don't think you can glean any useful information to compare the two systems from the number of complaints.
 
I am certainly not asking anyone for their opinion. I am documenting behavior that has been sporadically reported in the past. When did I ask anyone for their assistance?

This also might be an example of when FSD attempted (erroneously) to evade an accident. If FSD truly believed the truck was in my lane, and the only way to avoid a collision was to back away from it, this is perhaps the first documented case. That's the only reason I speculate it stopped and backed. And, as I said in my original post, there was no danger of collision. But if FSD believed there was (again, erroneously), this was an evasive maneuver.

Curiously, it did end up going wide around the mailbox. Doesn't seem likely that FSD anticipated that, though.

 
But like 2/3rds of the Tesla complaints came after a media blitz about phantom braking that included information about filing a complaint. I don't think there was any media reporting of the Honda issue until recently. So if the two had the same reporting, there would probably be a lot more complaints filed. (And people who had a crash are more likely to file a complaint.)

So I don't think you can glean any useful information to compare the two systems from the number of complaints.
I don't really care about the total number of complaints (for which the things you mentioned above play a factor), but more about the ones that involved crashes or injuries. Honda had 6; Tesla had zero. Scaling the numbers, you would expect Tesla to have 1-2. If a crash happened due to phantom braking, I think the likelihood it gets reported to NHTSA (and, for Tesla, the media, given they are a media favorite) is much higher.
 
This also might be an example of when FSD attempted (erroneously) to evade an accident. If FSD truly believed the truck was in my lane, and the only way to avoid a collision was to back away from it, this is perhaps the first documented case. That's the only reason I speculate it stopped and backed. And, as I said in my original post, there was no danger of collision. But if FSD believed there was (again, erroneously), this was an evasive maneuver.
It's possible FSD thought the truck was coming into your lane (possibly to go around something) and wanted to give it more space, so it backed up. Then, as the truck "returned" to its lane, FSD just drove forward.

As they say: monitor what FSD is doing, intervene when it does something wrong, and report it. That's the point of testing.
 
There are supposedly 2,000 Tesla employees with FSD beta. It's not surprising that one of them didn't read the company's social media policy.
I'm sure all 100k employees have to go through training, and employees with FSD specifically would have been clearly told by email not to post on social media.

I think it's more a case of "they can't possibly find out who I am" than ignorance.
 
I know many of us liked the videos from AI Addict. Well, Tesla fired the employee who created the AI Addict YouTube channel and removed his access to FSD Beta.

Bernal says before he was dismissed, managers verbally told him he “broke Tesla policy” and that his YouTube channel was a “conflict of interest.”
Losing FSD Beta access in his own car has curtailed his ability to create reviews of the system. However, he has attained access to other vehicles with FSD Beta enabled, and plans to continue his independent research and reviews.

 
I know many of us liked the videos from AI Addict. Well, Tesla fired the employee who created the AI Addict YouTube channel and removed his access to FSD Beta.

When he hit the bollards, my first reaction was that he should have taken over way before that collision; it was totally avoidable. To me, he wasn't using FSD beta safely, and it created unnecessary negative press. Now, there are tons of videos showing FSD failing with drivers using the system appropriately, so FSD beta can justifiably create negative press for itself. But this one was unwarranted. And since it came from a self-declared Tesla employee, I can see why Tesla fired him.

He also said something dumb, like "This is the first time in half a year of using FSD that I hit something," as if to say he was a safe tester (I don't remember the exact quote; it's been a while). Well, that's not really a good track record. Ideally I don't want to be colliding with anything on a half-yearly basis.
 
I'm sure all 100k employees have to go through training ... and specifically employees with FSD would have been emailed clearly not to post on social media.

I think its more a case of .... "they can't possibly find out who I am" than ignorance.

According to the article, they were fully aware of his YouTube channel, and fully aware of what he was doing with it.

His ignorance was thinking he could post something negative when employed by an Elon Musk company.

Personally, I think we live in a fire-happy culture. I would have just told him to stop posting videos due to the conflict of interest, and that would have been the end of it.

Firing someone just makes the whole issue blow up even more.
 
Tangentially related because I found the videos informative:
I didn't dig, but I always assumed that anyone posting who said they were a Tesla employee at some point would be a former employee, not a current one. I'm surprised a current employee would post such videos on their own time. It seems like too much of a risk of insider information being leaked (even if not on purpose, it would bleed into any analysis you do). It would be a different case if Tesla had an "ambassador" type of employee (many companies do this), but Tesla does not.
 
When he hit the bollards, my first reaction was that he should have taken over way before that collision; it was totally avoidable. To me, he wasn't using FSD beta safely, and it created unnecessary negative press. Now, there are tons of videos showing FSD failing with drivers using the system appropriately, so FSD beta can justifiably create negative press for itself. But this one was unwarranted. And since it came from a self-declared Tesla employee, I can see why Tesla fired him.

He also said something dumb, like "This is the first time in half a year of using FSD that I hit something," as if to say he was a safe tester (I don't remember the exact quote; it's been a while). Well, that's not really a good track record. Ideally I don't want to be colliding with anything on a half-yearly basis.
Ah, I never connected the dots. This makes a lot of sense if he was the same person who was the first to post a public crash with the system. I don't recall anyone in the media back then suggesting he was an employee, though.