GlmnAlyAirCar
To make the video easier to view, I sent the masters out to the production department (a.k.a., my teenage son) to combine all four camera outputs into a single view.
Unfortunately you didn't have the visualization (and even if you did, it wouldn't show it AFAIK), but could it be that it predicted the truck might cross your path (e.g., turning into a driveway, then going in the path it ended up with)? That would also be another possible explanation.

I am certainly not asking anyone for their opinion. I am documenting behavior that has been sporadically reported in the past. When did I ask anyone for their assistance?
This also might be an example of when FSD attempted (erroneously) to evade an accident. If FSD truly believed the truck was in my lane, and the only way to avoid a collision was to back away from it, this is perhaps the first documented case. That's the only reason I speculate it stopped and backed. And, as I said in my original post, there was no danger of collision. But if FSD believed there was (again, erroneously), this was an evasive maneuver.
Comparing the investigation populations: of the 416k Tesla vehicles under investigation, there were 354 complaints, but zero that involved crashes, injuries, or fatalities. For Honda's investigation covering 1.7 million vehicles, there were 278 complaints.
I don't really care about the total number of complaints (for which the things you mentioned above play a role), but more about the ones that involved crashes or injuries. Honda had 6; Tesla had zero. Scaling the numbers, you would expect Tesla to have 1-2. If a crash happened due to phantom braking, I think the likelihood it gets reported to NHTSA (and for Tesla, the media, given they are a media favorite) is much higher.

But something like two-thirds of the Tesla complaints came after a media blitz about phantom braking that included information about filing a complaint. I don't think there was any media reporting of the Honda issue until recently, so if it had gotten the same coverage there would probably be a lot more complaints filed. (And people who had a crash are more likely to file a complaint.)
So I don't think you can glean any useful information to compare the two systems from the number of complaints.
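The scaling argument in the post above can be checked quickly; here's a minimal sketch using the investigation populations and complaint counts quoted in this thread (those figures are taken from the posts, not independently verified):

```python
# Scale Honda's crash/injury complaint count to Tesla's smaller
# investigated fleet, assuming equal per-vehicle complaint rates.
honda_vehicles = 1_700_000      # Honda investigation population (from the thread)
tesla_vehicles = 416_000        # Tesla investigation population (from the thread)
honda_crash_complaints = 6      # Honda complaints involving crashes or injuries

expected_tesla = honda_crash_complaints * tesla_vehicles / honda_vehicles
print(round(expected_tesla, 2))  # ~1.47, i.e. the "1-2 expected" figure in the post
```

This only shows the arithmetic behind the "expected 1-2" claim; as the post notes, reporting bias makes the raw comparison unreliable either way.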
It's possible FSD thought the truck was coming into your lane (possibly to go around something), so it wanted to give it more space and backed up. Then, as the truck "returned" to its lane, FSD just drove forward.
Who takes reports or complaints about broken phantoms anyway? Is there, like, a BCB, a Better Casper Bureau or something?
All companies have policies around social media posting ... no idea why someone would so blatantly flout them.

Tangentially related, because I found the videos informative:
Tesla fired an employee after he posted driverless tech reviews on YouTube
John Bernal was a Tesla employee who showed FSD Beta to the world on his YouTube channel, AI Addict. He was fired in February.
www.cnbc.com
There are supposedly 2,000 Tesla employees with FSD Beta. It's not surprising that one of them didn't read the company's social media policy.
I'm sure all 100k employees have to go through training ... and specifically, employees with FSD would have been clearly told by email not to post on social media.
Bernal says before he was dismissed, managers verbally told him he “broke Tesla policy” and that his YouTube channel was a “conflict of interest.”
Losing FSD Beta access in his own car has curtailed his ability to create reviews of the system. However, he has attained access to other vehicles with FSD Beta enabled, and plans to continue his independent research and reviews.
I know many of us liked the videos from AI Addict. Well, Tesla fired the employee who was the creator of the AI Addict youtube channel and removed his access to FSD Beta.
Tesla fired an employee after he posted driverless tech reviews on YouTube
John Bernal was a Tesla employee who showed FSD Beta to the world on his YouTube channel, AI Addict. He was fired in February.
www.cnbc.com
I think it's more a case of ... "they can't possibly find out who I am" than ignorance.
Agree, upvotes for AI Addict. Firing someone just makes the whole issue blow up even more.
I didn't dig, but I always assumed anyone posting who said they were a Tesla employee would be a former employee, not a current one. I'm surprised a current employee would post such videos on their own time. It seems like too much of a risk of insider information being leaked (even if not on purpose, it would bleed into any analysis you do). It would be a different case if Tesla had an "ambassador" type of employee (many companies do this), but Tesla does not.
Ah, I never connected the dots. This makes a lot of sense if he was the same person who was the first to post a public crash with the system. I don't recall anyone in the media suggesting he was an employee back then, though.

When he hit the bollards, my first reaction was that he should have taken over well before that collision; it was totally avoidable. To me, he wasn't using FSD Beta safely, and it created unnecessary negative press. Now there are tons of videos showing FSD failing with drivers using the system appropriately, so FSD Beta can justifiably create negative press for itself. But this one was unwarranted. And since it came from a self-declared Tesla employee, I can see why Tesla fired him.
He also said something dumb, like "This is the first time in half a year of using FSD that I hit something," as if to say he was a safe tester (I don't remember the exact quote; it's been a while). Well, that's not really a good track record. Ideally I don't want to be getting into collisions with anything on a half-yearly basis.