Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

FSD Beta Attempts to Kill Me; Causes Accident

Long time lurker, first time poster. I have been trying to work with Tesla to resolve this issue out of the public domain, but they have been characteristically terrible and honestly don't seem to care. Over the last 3 weeks, I have sent multiple emails, followed up via phone calls, escalated through my local service center, and nobody from Tesla corporate has even emailed or called to say they are looking into this. One of my local service center technicians opened a case with engineering, which she said would take 90 days to review. I find that absurd, especially when Tesla is releasing new versions every 2 weeks. I think it's important for people to be extra cautious about which roads they engage FSD beta on, especially since Tesla seems to be ignoring my report entirely.




This incident happened almost 3 weeks ago, on Monday, November 22nd, at around 6:15 in the evening, shortly after the sun had set. I was driving my Tesla Model Y on a two-lane rural road and had FSD engaged. The car was still on version 10.4 at the time. It was a clear night, with no rain or adverse weather conditions. Everything was going fine, and I had used FSD Beta on this stretch of road before without a problem. There was some occasional phantom braking, but that had been sort of common with 10.4.

A banked right-hand curve in this two-lane road came up, with a vehicle coming around the curve in the opposite direction. The Model Y slowed slightly and began making the turn properly and without cause for concern. Suddenly, about 40% of the way through the turn, the Model Y straightened the wheel and crossed over the center line into the direct path of the oncoming vehicle. I reacted as quickly as I could, trying to pull the vehicle back into the lane. I really did not have a lot of time to react, so I chose to override FSD by turning the steering wheel, since my hands were already on the wheel and I felt this would be the fastest way to avoid a front-overlap collision with the oncoming vehicle. When I attempted to pull the vehicle back into my lane, I lost control and skidded off into a ditch and through the woods.

I was pretty shaken up and the car was in pieces. I called for a tow, but I live in a pretty rural area and could not find a tow truck driver who would touch a Tesla. I tried moving the car and heard underbody shields and covers rubbing against the moving wheels. I ended up getting out with a utility knife, climbing under the car, and cutting out several shields, wheel well liners, and other plastic bits that were lodged into the wheels. Surprisingly, the car was drivable and I was able to drive it to the body shop.

Right after the accident, I made the mistake of putting it in park and getting out of the vehicle to check the situation before I hit the dashcam save button. The drive to the body shop was over an hour long, so the footage was overwritten. Luckily, I was able to use some forensic file-recovery software to recover the footage off the external hard drive I had plugged in.

In the footage, you can see the vehicle leave the lane, and within about 10 frames, I had already begun pulling back into the lane before losing control and skidding off the road. Since Teslacam records at about 36 frames per second, this would mean I reacted within about 280ms of the lane departure. I understand it is my responsibility to pay attention and maintain control of the vehicle, which I agreed to when I enrolled in FSD Beta. I was paying attention, but human reaction does not get much faster than this, and I am not sure how I could otherwise have avoided this incident. The speed limit on this road is 55mph. I would estimate FSD was probably going about 45-50mph, but I have no way to confirm. I think the corrective steering I applied was too sharp given the speed the vehicle was going, and I lost grip with the pavement. In the 40%-speed slowed-down version of the clip, you can sort of see the back end of the car break loose in the way the front end starts to wiggle as the mailbox makes its way to the left side of the frame.
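For what it's worth, the frames-to-time arithmetic can be checked directly. This is only a back-of-the-envelope sketch using the post's own figures (roughly 36 fps recording and a roughly 10-frame delay, neither independently verified):

```python
# Back-of-the-envelope reaction-time estimate from dashcam frames.
# Assumptions taken from the post itself (not independently verified):
# Teslacam records at roughly 36 frames per second, and the corrective
# steering begins about 10 frames after the lane departure.
FPS = 36
FRAMES_TO_REACT = 10

ms_per_frame = 1000 / FPS              # ~27.8 ms per frame
reaction_ms = FRAMES_TO_REACT * ms_per_frame

print(f"{reaction_ms:.0f} ms")         # prints "278 ms"
```

At ~36 fps each frame spans about 28 ms, so a 10-frame delay lands just under 0.3 seconds, which is on the fast end of plausible human steering reactions.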

Surprisingly, I somehow managed to steer this flying car through a mini-forest, avoiding several trees (although I did knock off the driver's side mirror). There is no side panel damage whatsoever. The bumper cover is ruined and the car sustained fairly severe structural/suspension damage, both front and rear suspension components.

Luckily, nobody was hurt (except my poor car). I could not imagine the weight on my conscience if I had been too slow to intervene and ended up striking that oncoming vehicle. Front overlap collisions are some of the most deadly ways to crash a car, and bodily injury would have been very likely.

I have a perfect driving record and have never had an at-fault accident in the over 10 years I have been licensed. The thought of filing an insurance claim and increasing my premiums over this incident makes me sick. I am considering legal action against Tesla, but I'm not going to get into that here. Just wanted to make everyone aware and hyper-vigilant about FSD. I thought I was, but then this happened. I am going to be much more careful about the situations in which I decide to engage it. There is too much at stake, it is not mature enough, and frankly, Tesla's apathy and lack of communication around this incident really concerns me, as both an owner and a road-user.


tl;dr: Be careful with FSD, folks. And if you get into an accident, hit the dashcam save button or honk your horn before you put it in park.



“Display of a Tesla car on autopilot mode showing current speed, remaining estimated range, speed limit and presence of vehicles on motorway lanes” by Marco Verch is licensed under CC BY 2.0.
 
Last edited by a moderator:
OP removed the video, so no latecomers can chime in. OP has also not responded but has been back on the forum since the post - now the discussion is somewhat off topic and a free-for-all. First post of ALL time, and it's a hot-button topic (FSD).



Did the OP post any in-cabin video showing his or her hands on the steering wheel, or did they only show the cams from the car? If the latter, no one on here knows what the car did or did not do, since you can't see what the driver was doing.
 
  • Like
Reactions: nvx1977
As to the AI Driver clip, at 3:03 Auto-steering (blue wheel) is already OFF. He was just wiggling the wheel hoping to stop the annoying alert. I've had the red wheel alert stay up for well over a minute once, maybe two. Got stuck. Even as I was back to driving.

One thought. Anybody using ANY form of automation on a Tesla MUST develop the reflex to disengage everything by tapping up on the right stalk immediately upon sign of trouble. While braking. While swerving. While cursing. Bam, like that. Save your butt. Save your pride.
 
Last edited:
OP removed the video, so no latecomers can chime in. OP has also not responded but has been back on the forum since the post - now the discussion is somewhat off topic and a free-for-all. First post of ALL time, and it's a hot-button topic (FSD).



Did the OP post any in-cabin video showing his or her hands on the steering wheel, or did they only show the cams from the car? If the latter, no one on here knows what the car did or did not do, since you can't see what the driver was doing.
There's a comment in the Jalopnik article claiming the person who posted the video had other videos of him using FSD Beta without his hands on it:
"This guy has other videos of him simply not having his hands on the wheel, which is entirely against the point of the beta. Had he simply held the wheel enough to let the curve continue, it would have disengaged and continued."
Tesla Full Self-Driving Beta Causes Accident With Model Y

Does anyone know the YouTube channel of the OP (assuming he did not delete the whole channel)?

Edit: Never mind, the channel has all videos deleted (as expected, especially if OP's attorney is involved, as people are speculating):
https://www.youtube.com/channel/UCNmzLmgDajySek7ztCHhz-w/videos

I found it by doing some snooping on the Internet Archive, which actually still has the full video archived (sorry, I can't do a regular link - the forum software refuses to let me post it, since it keeps detecting it as the dead YouTube link - you'll have to copy and paste it yourself):

Code:
https://web.archive.org/web/20211210213114/https://www.youtube.com/watch?v=7VhrG-7SBZg
 
Victim blaming much in this thread?
No way. You opt in to testing a beta, and are supposed to have both hands on the wheel. I don’t know what is so hard to understand about “the car may do the worst possible thing at the worst possible time” - it’s right there in the instructions.

That said, I’m fine with holding Elon and Tesla accountable for what they say, so if anyone wants to file a lawsuit against them for false advertisement, I’ll wait.

Btw, the reason there hasn’t been a class action is because attorneys get paid by fees from what they win (usually 33 percent or more). I’m guessing the attorneys that were presented with this opportunity by potential clients went “meh”….
 
  • Disagree
Reactions: FlatSix911
We have quite a lot of evidence that FSD can suddenly fail to stay in its lane or even steer towards oncoming traffic.
Another post that ignores the request to post links to good evidence. If the evidence is abundant why not just post it?

Please back up your claim with "quite a lot of evidence" of FSD steering into oncoming traffic.
 
Another post that ignores the request to post links to good evidence. If the evidence is abundant why not just post it?

Please back up your claim with "quite a lot of evidence" of FSD steering into oncoming traffic.

All in this thread. Sadly, it is not really worth discussing.
If Tesla had the confidence in their system that Mercedes has, why don't they release a similar Level 3 system for Europe ASAP?
It is Tesla and its fans that need to back up their extraordinary claims with extraordinary evidence.
 
Hilarious. Still unable to back up the claim of “quite a lot of evidence.”

The OP video is of course at issue because we don’t know that FSD was engaged. And I already pointed out that I reviewed a few minutes of some of the other videos and didn’t see any evidence, and I’m not going to watch a 20-minute video looking for the 5 seconds that supposedly supports the claim of “quite a lot of evidence.”

Presumably you have already done so, and so can easily provide time-linked YouTube links for the multiple videos, or whatever other evidence you think supports the claim of “quite a lot of evidence.”

Very simple: stop posting insults, empty invective, and attempted burden-shifting that only undermines your credibility and instead post facts and evidence.








All in this thread. Sadly, it is not really worth discussing.
If Tesla had the confidence in their system that Mercedes has, why don't they release a similar Level 3 system for Europe ASAP?
It is Tesla and its fans that need to back up their extraordinary claims with extraordinary evidence.
 
Hilarious. Still unable to back up the claim of “quite a lot of evidence.”

The OP video is of course at issue because we don’t know that FSD was engaged. And I already pointed out that I reviewed a few minutes of some of the other videos and didn’t see any evidence, and I’m not going to watch a 20-minute video looking for the 5 seconds that supposedly supports the claim of “quite a lot of evidence.”

Presumably you have already done so, and so can easily provide time-linked YouTube links for the multiple videos, or whatever other evidence you think supports the claim of “quite a lot of evidence.”

Very simple: stop posting insults, empty invective, and attempted burden-shifting that only undermines your credibility and instead post facts and evidence.
This one is only 22 seconds.
 
Hilarious. Still unable to back up the claim of “quite a lot of evidence.”

The OP video is of course at issue because we don’t know that FSD was engaged. And I already pointed out that I reviewed a few minutes of some of the other videos and didn’t see any evidence, and I’m not going to watch a 20-minute video looking for the 5 seconds that supposedly supports the claim of “quite a lot of evidence.”

Presumably you have already done so, and so can easily provide time-linked YouTube links for the multiple videos, or whatever other evidence you think supports the claim of “quite a lot of evidence.”

Very simple: stop posting insults, empty invective, and attempted burden-shifting that only undermines your credibility and instead post facts and evidence.
Two barriers to OP’s success.
1) There is no evidence, other than black-box data, that is going to convince anyone here, or a jury, that FSD was engaged. The video is meaningless. Someone with a totaled Tesla has motivation to recoup by blaming the ever-suspicious FSD.
Standard Autopilot could have done the same thing he claims, BTW.
2) Even if he somehow obtains this data and it shows what he says happened is true, Tesla is very likely to be fully protected due to a) bad driving skills constituting contributory negligence, b) a potential claim of bad judgment for running FSD in the wrong conditions, and c) all the waiver stuff that comes along with FSD.
 
  • Helpful
Reactions: lUtriaNt
Hilarious. Still unable to back up the claim of “quite a lot of evidence.”

The OP video is of course at issue because we don’t know that FSD was engaged. And I already pointed out that I reviewed a few minutes of some of the other videos and didn’t see any evidence, and I’m not going to watch a 20-minute video looking for the 5 seconds that supposedly supports the claim of “quite a lot of evidence.”

Presumably you have already done so, and so can easily provide time-linked YouTube links for the multiple videos, or whatever other evidence you think supports the claim of “quite a lot of evidence.”

Very simple: stop posting insults, empty invective, and attempted burden-shifting that only undermines your credibility and instead post facts and evidence.
It is linked in my previous post. Are you blind, or do you just want to be difficult?

I ask you to document that FSD is really a safe and trustworthy system that never crashes. Tesla should be able to help you. Start by asking for a disengagement-per-mile report for the beta.

Please stop lying about the supreme capabilities of FSD Beta if you cannot show the evidence for it!
 
Hilarious. Still unable to back up the claim of “quite a lot of evidence.”

The OP video is of course at issue because we don’t know that FSD was engaged. And I already pointed out that I reviewed a few minutes of some of the other videos and didn’t see any evidence, and I’m not going to watch a 20-minute video looking for the 5 seconds that supposedly supports the claim of “quite a lot of evidence.”

Presumably you have already done so, and so can easily provide time-linked YouTube links for the multiple videos, or whatever other evidence you think supports the claim of “quite a lot of evidence.”

Very simple: stop posting insults, empty invective, and attempted burden-shifting that only undermines your credibility and instead post facts and evidence.
And another thing: stop believing what the corporation is telling you!
 
  • Funny
Reactions: alexgr and bhzmark
The OP video is of course at issue because we don’t know that FSD was engaged.
I certainly understand your desire for actual evidence, and it's unfortunate there isn't cabin footage of this incident. Eventually we'll probably have some from another incident.

But putting that aside, remember that Tesla tells us that this is how FSD behaves, so on the face of it, I don't have any issue believing that FSD was engaged in this case (also all the behaviors in the video are consistent with FSD being engaged until the intervention/overcorrection). I also have little doubt that the driver's hands were not at 9 & 3. In general, it's hard (though still possible if you're really jumpy - but then maybe don't use FSD Beta?) to overcorrect in a situation like this with both hands on the wheel in the optimal position, with slightly mismatched torques to satisfy the sensor, as they must be at all times when the system is engaged and the vehicle is in motion.

We'll likely never know, but I'm content with assuming in this case that FSD was engaged, and that this was driver error in response to a typical FSD error of the kind Tesla describes. I'm not broken up about it, but it shows the importance of taking the task of using FSD Beta seriously. I think you said earlier you don't have FSD? I assure you this behavior is normal and expected, as Tesla says. And this incident shows that that behavior can be extremely dangerous when the driver is not 100% in control of the vehicle while FSD is engaged.

I'd like Tesla to get even more strict about kicking people out of the program with their driver monitoring. It's actually somewhat decent right now - they really do detect if you're not looking straight ahead for a period of time - but I'd love for people getting kicked out (for a long period of time) to become more routine. It's not a major loss for the owner at all to lose FSD Beta access, so I feel like Tesla could fine tune the system and weed out more people. There likely won't be any backlash - it just makes it a better game since that's how Tesla has set things up so far - and it will help improve safety (which is a major concern when the system has limitations like this and drivers are not sufficiently trained).
 
Last edited:
It is linked in my previous post. Are you blind, or do you just want to be difficult?

I ask you to document that FSD is really a safe and trustworthy system that never crashes. Tesla should be able to help you. Start by asking for a disengagement-per-mile report for the beta.

Please stop lying about the supreme capabilities of FSD Beta if you cannot show the evidence for it!
You have a bunch of posts. Just link to the **multiple** “quite a lot” of links that support the claim. And again, try to remove the insults from the posts. They are a poor substitute for actually providing the links.
 
And another thing: stop believing what the corporation is telling you!
I disagree. @bhzmark's issue is that he is not believing what the corporation is telling him.
The FSD Beta disclaimer clearly states that "It may do the wrong thing at the worst time." It's hard to imagine a better example of this than turning towards oncoming traffic.