
FSD Beta Attempts to Kill Me; Causes Accident

Long time lurker, first time poster. I have been trying to work with Tesla to resolve this issue out of the public domain, but they have been characteristically terrible and honestly don't seem to care. Over the last 3 weeks, I have sent multiple emails, followed up via phone calls, escalated through my local service center, and nobody from Tesla corporate has even emailed or called to say they are looking into this. One of my local service center technicians opened a case with engineering, which she said would take 90 days to review. I find that absurd, especially when Tesla is releasing new versions every 2 weeks. I think it's important for people to be extra cautious about which roads they engage FSD beta on, especially since Tesla seems to be ignoring my report entirely.




This incident happened almost 3 weeks ago on Monday, November 22nd, at around 6:15 in the evening, shortly after the sun had set. I was driving my Tesla Model Y on a two-lane rural road with FSD engaged. The car was still on version 10.4 at the time. It was a clear night, with no rain or adverse weather conditions. Everything was going fine, and I had used FSD beta on this stretch of road before without a problem. There was some occasional phantom braking, but that had been fairly common with 10.4.

A banked right-hand curve came up in this two-lane road, with a vehicle coming around the curve in the opposite direction. The Model Y slowed slightly and began making the turn properly, without cause for concern. Suddenly, about 40% of the way through the turn, the Model Y straightened the wheel and crossed over the center line into the direct path of the oncoming vehicle. I reacted as quickly as I could, trying to pull the vehicle back into the lane. I did not have a lot of time to react, so I chose to override FSD by turning the steering wheel, since my hands were already on the wheel and I felt this would be the fastest way to avoid a front overlap collision with the oncoming vehicle. When I attempted to pull the vehicle back into my lane, I lost control and skidded off into a ditch and through the woods.

I was pretty shaken up and the car was in pieces. I called for a tow, but I live in a pretty rural area and could not find a tow truck driver who would touch a Tesla. I tried moving the car and heard underbody shields and covers rubbing against the moving wheels. I ended up getting out with a utility knife, climbing under the car, and cutting out several shields, wheel well liners, and other plastic bits that were lodged into the wheels. Surprisingly, the car was drivable and I was able to drive it to the body shop.

Right after the accident, I made the mistake of putting it in park and getting out of the vehicle first to check the situation before I hit the dashcam save button. The drive to the body shop was over an hour long, so the footage was overwritten. Luckily, I was able to use some forensic file recovery software to recover the footage from the external drive I had plugged in.

In the footage, you can see the vehicle leave the lane, and within about 10 frames, I had already begun pulling back into the lane before losing control and skidding off the road. Since Teslacam records at about 36 frames per second, this means I reacted within roughly 280ms of the lane departure. I understand it is my responsibility to pay attention and maintain control of the vehicle, which I agreed to when I enrolled in FSD beta. I was paying attention, but human reaction does not get much faster than this, and I am not sure how I could have otherwise avoided this incident. The speed limit on this road is 55mph. I would estimate FSD was probably going about 45-50mph, but I have no way to confirm. I think the corrective steering I applied was too sharp given the speed the vehicle was going, and I lost grip with the pavement. On the version of the clip slowed to 40% speed, you can sort of see the back end of the car break loose in the way the front end starts to wiggle as the mailbox makes its way to the left side of the frame.
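For anyone who wants to check the frame math, here is a quick sketch. The ~36 fps recording rate and the 10-frame gap are the figures stated above; everything else is just arithmetic:

```python
# Rough reaction-time estimate from dashcam frame counts.
# Assumes Teslacam records at ~36 frames per second and that the
# corrective steering begins about 10 frames after the lane departure.
fps = 36
frames_until_reaction = 10

reaction_ms = frames_until_reaction / fps * 1000
print(f"Estimated reaction time: ~{reaction_ms:.0f} ms")  # ~278 ms
```

At ~278 ms, that is close to the lower bound of typical human reaction times to an unexpected visual event, which supports the point that the driver could not have intervened much faster.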

Surprisingly, I somehow managed to steer this flying car through a mini-forest, avoiding several trees (although I did knock off the driver's side mirror). There is no side panel damage whatsoever. The bumper cover is ruined and the car sustained fairly severe structural/suspension damage, both front and rear suspension components.

Luckily, nobody was hurt (except my poor car). I could not imagine the weight on my conscience if I had been too slow to intervene and ended up striking that oncoming vehicle. Front overlap collisions are some of the most deadly ways to crash a car, and bodily injury would have been very likely.

I have a perfect driving record and have never had an at-fault accident in the over 10 years I have been licensed. The thought of filing an insurance claim and increasing my premiums over this incident makes me sick. I am considering legal action against Tesla, but I'm not going to get into that here. Just wanted to make everyone aware and hyper-vigilant about FSD. I thought I was, but then this happened. I am going to be much more careful about the situations in which I decide to engage it. There is too much at stake, it is not mature enough, and frankly, Tesla's apathy and lack of communication around this incident really concerns me, as both an owner and a road-user.


tl;dr: Be careful with FSD, folks. And if you get into an accident, hit the dashcam save button or honk your horn before you put it in park.



“Display of a Tesla car on autopilot mode showing current speed, remaining estimated range, speed limit and presence of vehicles on motorway lanes” by Marco Verch is licensed under CC BY 2.0.
 
I believe the YouTube comments referred to in that post are not the OP's comments, but the comments from one of the FSD YouTubers.



The comments on the OP's YouTube video are documented here: Post in thread 'FSD Beta Attempts to Kill Me; Causes Accident'
These comments (unrelated to this thread) are silly. “Car would not give control back” (nonsense of course). It was just beeping incessantly because TACC was still engaged, but it was under full manual steering control. Happens all the time (this ridiculous fall back to TACC has been commented on elsewhere). Still: there’s no need to be so ridiculously dramatic about the verbiage and AI Addict could just describe what had happened. He’s a Tesla engineer??? Unbelievable…or maybe it makes sense, haha.

Just a routine occurrence explaining nothing about the events at issue in this thread.
 
The OP's accident was caused by over-correction. Driving a Camry with TACC and lane keep assist, I had to steer the car away from an imminent death trap multiple times within one week. If I had relied on Toyota fully, I might have over-corrected, and then I wouldn't be writing this here; instead I would be bashing Toyota for trying to kill me.
When you smoothly turn the wheel of your Toyota does it snap out of lane assist mode like autosteer does?
 
When you smoothly turn the wheel of your Toyota does it snap out of lane assist mode like autosteer does?
Yeah, curious. 2017 Highlander has weak and pathetic lane assist, but no jerking to override, but wonder on the newer ones. Smooth blending (without disengagement even) would be great on Teslas and would in some cases limit the overcorrection. I suspect lane keeping is very weak even on the new Toyotas, though. With smooth blending of course.

Not sure it would have helped in this case though. I’d be surprised if the driver had hands firmly at 9 and 3, as you must at all times when using FSD. If you are using no hands, one hand or non-optimal positioning, overcorrection seems more likely.
 
When you smoothly turn the wheel of your Toyota does it snap out of lane assist mode like autosteer does?
Sorta... Toyota tries to resist your input, but it does so more linearly than Tesla. Knowing Tesla, I usually jerk the wheel quickly to get it to disengage AP instead of applying a constant force. From the video, it seems the driver was not ready to correct the AP (well, the FSD beta AP).
 
Yeah, curious. 2017 Highlander has weak and pathetic lane assist, but no jerking to override, but wonder on the newer ones. Smooth blending (without disengagement even) would be great on Teslas and would in some cases limit the overcorrection. I suspect lane keeping is very weak even on the new Toyotas, though. With smooth blending of course.

Not sure it would have helped in this case though. I’d be surprised if the driver had hands firmly at 9 and 3, as you must at all times when using FSD. If you are using no hands, one hand or non-optimal positioning, overcorrection seems more likely.
Toyota does resist the input noticeably, but it seems to yield to your input force instead of keeping the resistance constant until disengagement, as Teslas do. I like Toyota's approach more than Tesla's, but I can easily get used to both. The lane keep assist in Toyota is not just pathetic, it is dangerous. Several times on the highway the car started to bounce between the lines, and after a couple of bounces it moved into another lane or off the road as if there were no lines at all. If you are not ready for that, you'd be dead.
 
Toyota does resist the input noticeably, but it seems to yield to your input force instead of keeping the resistance constant until disengagement, as Teslas do. I like Toyota's approach more than Tesla's, but I can easily get used to both. The lane keep assist in Toyota is not just pathetic, it is dangerous. Several times on the highway the car started to bounce between the lines, and after a couple of bounces it moved into another lane or off the road as if there were no lines at all. If you are not ready for that, you'd be dead.
Are you referring to Lane Tracing Assist?

Haven't tried that but I've been driving a fleet 2021 Ford F150 since May and it has Lane Keeping, which I would never pay extra money for.
 
Are you referring to Lane Tracing Assist?

Haven't tried that but I've been driving a fleet 2021 Ford F150 since May and it has Lane Keeping, which I would never pay extra money for.
Thank you. I just checked; yes, it is called LTA, but I referred to it as lane keep assist. Interestingly, it was on the base SE trim of a Camry (a rental).
Edit: the car had this TACC + lane centering thingy, but it had no side sensors to "see" if anything is around it. If something goes wrong (and it goes wrong pretty often), Toyota will happily ram into any unsuspecting vehicle in the neighboring lane. Apparently, when something goes wrong, Toyota's LTA system simply drops line detection on one side, letting the car cross the line as if nothing is happening.
 
Wonder if below is part of a fix.
From 10.6 release notes:
* ⁠New general static object network with 17% precision improvements in high curvature and nighttime cases.

Could be. The accident in question happened in late November, before 10.6. But I still think that in this case the car was swerving to avoid what it thought was a VRU (vulnerable road user) but was actually a mailbox. So 10.6 might do the same thing. Stay vigilant.
 
Would you like more examples, or does that cover it?
I watched a few and didn't see fsd abruptly **cross** a double yellow. It's a simple exercise to use the time stamp youtube link instead of posting a bunch of long videos with the knowledge that no one has the time to find the point that you think is relevant.

And the CNN guy is obviously being phony for the camera.
Odd choice to lead with that video which makes you also look phony
 
I watched a few and didn't see fsd abruptly **cross** a double yellow. It's a simple exercise to use the time stamp youtube link instead of posting a bunch of long videos with the knowledge that no one has the time to find the point that you think is relevant.

And the CNN guy is obviously being phony for the camera.
Odd choice to lead with that video which makes you also look phony

I mean, if you didn't see it driving on the wrong side of the road, and you didn't see it jerk the wheel into an oncoming car's path, then you're either blind or a liar. There are several examples among those I provided, and if you "watched a few" then you would have seen it.

The CNN guy just so happened to have the exact same experience as many FSD testers, but somehow he's a phony? Even though the videos I've provided have mostly come from big Tesla fans, and they show the exact same thing happening? Ok. Sure.
 
I mean, if you didn't see it driving on the wrong side of the road, and you didn't see it jerk the wheel into an oncoming car's path, then you're either blind or a liar. There are several examples among those I provided, and if you "watched a few" then you would have seen it.

The CNN guy just so happened to have the exact same experience as many FSD testers, but somehow he's a phony? Even though the videos I've provided have mostly come from big Tesla fans, and they show the exact same thing happening? Ok. Sure.
So instead of providing ANY link to a specific time showing what you claim, you post word salad.
Ignored. I don't know how I didn't ignore your nonsense long ago.
 
I watched a few and didn't see fsd abruptly **cross** a double yellow. It's a simple exercise to use the time stamp youtube link instead of posting a bunch of long videos with the knowledge that no one has the time to find the point that you think is relevant.

And the CNN guy is obviously being phony for the camera.
Odd choice to lead with that video which makes you also look phony
Oh that poor CNN guy... that video must be watched only along this one:
 
So instead of providing ANY link to a specific time showing what you claim

Did you not see the timestamps in the videos I already posted? I've already done the thing you asked me to do; you're trying to smokescreen this so you can keep pretending it isn't happening.

Oh that poor CNN guy... that video must be watched only along this one:

Ah yes, the owner begging Tesla not to kick him out of FSD, and his heavily clipped response video. Top notch work.

The problem with the fanbase is they can't accept obvious criticism, so instead of pushing Tesla to get better they just push an obviously wrong narrative. Meanwhile, Mercedes has been permitted to have highway L3 ADAS in Germany. Tesla's struggling to make L2 that won't crash into emergency vehicles, and Mercedes is going to be selling L3.

When someone dies "testing" FSD, it's going to catch the attention not only of US regulatory agencies but agencies worldwide. Good luck getting FSD in Canada or the EU anytime soon.
 
I've had this happen to me twice. Luckily I was able to correct it and not crash. On both occasions my GPS location was off-center on the car. I believe the car was trying to center itself back in the lane it thought it was supposed to be in. Have you had any GPS location issues on your car? Tesla told me it was a known issue and there was nothing they could do to fix it in FSD beta; I would have to wait for an update.

And for the record, I think it's terrible that Tesla isn't working with you to fix whatever this issue was, or at least acknowledge it, for the safety of all of their FSD beta drivers. The people talking about zero or few disengagements have no idea of some of the other problems people are having, for whatever reasons, and can be condescending, like you don't know how to use it or something. They literally have no clue. It's absolutely dangerous, and the people who have been in these situations know it.
 