
FSD Beta Attempts to Kill Me; Causes Accident

Long time lurker, first time poster. I have been trying to work with Tesla to resolve this issue out of the public domain, but they have been characteristically terrible and honestly don't seem to care. Over the last 3 weeks, I have sent multiple emails, followed up via phone calls, escalated through my local service center, and nobody from Tesla corporate has even emailed or called to say they are looking into this. One of my local service center technicians opened a case with engineering, which she said would take 90 days to review. I find that absurd, especially when Tesla is releasing new versions every 2 weeks. I think it's important for people to be extra cautious about which roads they engage FSD beta on, especially since Tesla seems to be ignoring my report entirely.

This incident happened almost 3 weeks ago on Monday, November 22nd at around 6:15 in the evening, just shortly after the sun had set. I was driving my Tesla Model Y on a two-lane rural road and had FSD engaged. The car was still on version 10.4 at the time. It was a clear night, no rain or adverse weather conditions. Everything was going fine, and I had previously used FSD beta on this stretch of road before without a problem. There was some occasional phantom braking, but that had been sort of common with 10.4.

A banked right-hand curve in this two-lane road came up, with a vehicle coming around the curve in the opposite direction. The Model Y slowed slightly and began making the turn properly and without cause for concern. Suddenly, about 40% of the way through the turn, the Model Y straightened the wheel and crossed over the center line into the direct path of the oncoming vehicle. I reacted as quickly as I could, trying to pull the vehicle back into the lane. I really did not have a lot of time to react, so I chose to override FSD by turning the steering wheel, since my hands were already on the wheel and I felt this would be the fastest way to avoid a front overlap collision with the oncoming vehicle. When I attempted to pull the vehicle back into my lane, I lost control and skidded off into a ditch and through the woods.

I was pretty shaken up and the car was in pieces. I called for a tow, but I live in a pretty rural area and could not find a tow truck driver who would touch a Tesla. I tried moving the car and heard underbody shields and covers rubbing against the moving wheels. I ended up getting out with a utility knife, climbing under the car, and cutting out several shields, wheel well liners, and other plastic bits that were lodged into the wheels. Surprisingly, the car was drivable and I was able to drive it to the body shop.

Right after the accident, I made the mistake of putting it in park and getting out of the vehicle first to check the situation before I hit the dashcam save button. The drive to the body shop was over an hour long, so the footage was overwritten. Luckily, I was able to use some forensic file recovery software to recover the footage off the external hard drive I had plugged in.

In the footage, you can see the vehicle leave the lane, and within about 10 frames I had already begun pulling back into the lane before losing control and skidding off the road. Since TeslaCam records at about 36 frames per second, this would mean I reacted within about 280ms of the lane departure. I understand it is my responsibility to pay attention and maintain control of the vehicle, which I agreed to when I enrolled in FSD beta. I was paying attention, but human reaction does not get much faster than this, and I am not sure how I could have otherwise avoided this incident. The speed limit on this road is 55mph. I would estimate FSD was probably going about 45-50mph, but I have no way to confirm. I think the corrective steering I applied was too sharp given the speed the vehicle was going, and I lost grip with the pavement. On the version of the clip slowed to 40% speed, you can sort of see the back end of the car break loose in the way the front end starts to wiggle as the mailbox makes its way to the left side of the frame.
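For anyone who wants to check the math, here is a quick back-of-the-envelope sketch in Python. The ~36 fps TeslaCam rate and the 10-frame reaction come from the clip as described above; the 0.9 friction coefficient in the grip estimate is an assumed value for dry asphalt, not anything measured from the footage.

# Reaction time implied by the dashcam frames (figures from the post above).
fps = 36                      # approximate TeslaCam frame rate
frames_to_react = 10          # frames between lane departure and correction
reaction_ms = frames_to_react / fps * 1000
print(f"reaction time ~ {reaction_ms:.0f} ms")        # ~278 ms

# Tightest no-skid turn radius at ~50 mph, assuming mu ~ 0.9 (dry asphalt).
v_ms = 50 * 0.44704           # 50 mph converted to m/s
mu, g = 0.9, 9.81             # assumed friction coefficient; gravity
min_radius_m = v_ms ** 2 / (mu * g)
print(f"min no-skid radius ~ {min_radius_m:.0f} m")   # ~57 m; any sharper and the tires let go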

Surprisingly, I somehow managed to steer this flying car through a mini-forest, avoiding several trees (although I did knock off the driver's side mirror). There is no side panel damage whatsoever. The bumper cover is ruined and the car sustained fairly severe structural/suspension damage, both front and rear suspension components.

Luckily, nobody was hurt (except my poor car). I could not imagine the weight on my conscience if I had been too slow to intervene and ended up striking that oncoming vehicle. Front overlap collisions are some of the most deadly ways to crash a car, and bodily injury would have been very likely.

I have a perfect driving record and have never had an at-fault accident in the over 10 years I have been licensed. The thought of filing an insurance claim and increasing my premiums over this incident makes me sick. I am considering legal action against Tesla, but I'm not going to get into that here. Just wanted to make everyone aware and hyper-vigilant about FSD. I thought I was, but then this happened. I am going to be much more careful about the situations in which I decide to engage it. There is too much at stake, it is not mature enough, and frankly, Tesla's apathy and lack of communication around this incident really concerns me, as both an owner and a road-user.


tl;dr: Be careful with FSD, folks. And if you get into an accident, hit the dashcam save button or honk your horn before you put it in park.



“Display of a Tesla car on autopilot mode showing current speed, remaining estimated range, speed limit and presence of vehicles on motorway lanes” by Marco Verch is licensed under CC BY 2.0.
 
So what are you expecting exactly?

An ADAS that doesn't turn into the path of an oncoming car, probably as a base feature. Since this behavior has been getting worse over time rather than better, that seems like far too much to ask.

Could you have accidentally put too much pressure on the wheel and disengaged FSD?

As I just said to the other person above, this has been getting worse. We're seeing FSD turn into oncoming traffic at very close distances from nearly everybody posting beta videos. That kind of behavior in, say, a 50 MPH zone means the two cars would close roughly 150 ft in a single second. We are going to see a head-on collision if Tesla doesn't stop this from happening immediately, yet it has been well over a year since 8.x started exhibiting this exact behavior.
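To put numbers on that closing speed, here's a rough Python sketch; the 50 MPH figure is the hypothetical above, and the ~278 ms reaction time is the one implied by the OP's frame count.

# Closing speed of two cars approaching head-on at 50 mph each.
MPH_TO_FTPS = 5280 / 3600     # 1 mph ~ 1.467 ft/s
closing_ftps = 2 * 50 * MPH_TO_FTPS
print(f"closing speed ~ {closing_ftps:.0f} ft/s")     # ~147 ft/s

# Separation lost before a ~278 ms human reaction even begins:
reaction_s = 0.278
print(f"gap closed during reaction ~ {closing_ftps * reaction_s:.0f} ft")   # ~41 ft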
 

Make sure you file this with the NHTSA if you seriously think FSD beta is too dangerous to deploy to the public.
 
the other car's headlights were recognized because the high beams shut off shortly before the car swerved.
Right.
Looks like FSD overreacted to the mailbox, and the driver overcorrected. This is a situation that warranted phantom braking rather than using the other side of the yellow line, even if the mailbox was misidentified as a person.
Good eye. I'd bet a cup of coffee this is correct with what we've seen so far. Could be a classic case of "wrong place, wrong time" complicated by low beams, too much speed for conditions, not enough human oversight, etc.
The human reaction to those headlights bearing down on you is to brake and pull the car back into the lane
Agree - with all of the phantom braking going on this is curious in that it apparently did not happen. If FSD is faced with what it thinks is an imminent pedestrian collision, with a known oncoming vehicle, the obvious answer is to brake - not "avoid the person and hit the car instead". Maybe somewhere down that decision tree is "hit the car" but not here.
To mitigate the inherent FSD risk I do my best to keep positive pressure on the wheel away from the center or side line whether on a straight or curved road
And mostly this. The driver over-corrected in a split-second, unexpected scenario. I'm sure it could happen to many of us in that same scenario, including me. In retrospect though, as @Cerberus and others said, their experience would have greatly helped to mitigate this. Anyone who has used AP for any amount of time would anticipate getting close to that inside stripe, more so than most people would. Those same AP users know to have a death grip on the wheel in sweeping turns that involve oncoming traffic.

Makes me wonder if the FSD beta criteria should include a year or more of AP time in non-interstate situations? I'm not a big Tesla fan, but I'd put 99% of the blame on the OP for inattentiveness and over-correction. Anyone with money but little experience can easily get FSD - just like they could buy a street bike that can do 175 MPH off the showroom floor. Maybe not the best choice in either case.

Again, this is given with what we know so far - lots of speculation.
 
In terms of the overcorrection, it seems like there are instances where the system can be difficult to disengage and can require real force that could lead to an overcorrection. I don't think that's what happened here, but look at what AI Addict had to do at the 3:00 mark in this video


Imagine dealing with that at higher speed -- this lends some believability to the report of the first FSD Beta accident
 
I don't think that's what happened here, but look at what AI Addict had to do at the 3:00 mark in this video.

AI Addict is also a terrible driver who overreacts, much like the CNN reporter. The car never fights the steering wheel. In that video he wiggled it for some reason, which isn't going to break out of Autosteer. He should have just tapped the brake.
 
AI Addict is also a terrible driver who overreacts, much like the CNN reporter. The car never fights the steering wheel. In that video he wiggled it for some reason, which isn't going to break out of Autosteer. He should have just tapped the brake.
Well, that immediately brings up questions about communication and driver training/qualification; it feels like there shouldn't be Beta testers who don't know what to do in a scenario like this. But if this were an emergency situation and the system was blasting "Take Over Immediately!" at you, I think the normal instinct would be to control the steering, and disengaging should probably happen easily via just about any reflex action.

Lots of weirdness and potentially concerning stuff going on here.
 
BTW, this thing could have easily happened with AP available to everyone - I don’t think this is AP specific.

Regular AP almost certainly wouldn’t have applied that much steering input. It (along with other lane-keep type systems) errs on the side of continuing on its chosen path. I bet it would have phantom braked and that’s it.

That’s what makes FSD Beta potentially more dangerous/unintuitive for experienced AP users. We assume the car won’t do something like steer over the double yellow into oncoming traffic based on our years of driving on regular autopilot.
 
there shouldn't be Beta testers who don't know what to do in a scenario like this.
Anyone who has used regular AP should know this. It’s quite surprising one of the older FSD Beta testers doesn’t. Could just be doing it to be dramatic though. Get more Tesla haters sharing the video that way.

Also, controlling the steering wheel is very easy, but wiggling it back and forth 2° each direction isn’t the way you do it lol
 
Wow, I just saw the video. This is nothing new with FSD beta. In fact, I used 10.6 an hour ago, and it tried to ram me into a double parked semi truck.

With the beta, whenever I sense ANY risk (especially an oncoming car on a 2 lane road), I am driving with my hands on the wheel. If FSD beta decides to deviate from my intended drive in any way, it's automatically disengaged.

It's unfortunate this happened, and it's not surprising honestly, but it's the nature of FSD beta: constant vigilance and hands on wheel.
Not sure if you are defending Tesla or not :)
 
In terms of the overcorrection, it seems like there are instances where the system can be difficult to disengage and can require real force that could lead to an overcorrection. I don't think that's what happened here, but look at what AI Addict had to do at the 3:00 mark in this video


Imagine dealing with that at higher speed -- this lends some believability to the report of the first FSD Beta accident
He explains in the YouTube comments that he was trying to get the warning to stop beeping, and that he was doing the wheel shaking, not the car. Would pressing the brake stop the warning? I don't know; one would assume the warning would have gone off once the car detected him moving the wheel. It is confusing to watch.
 
I don't know whether this info is of any relevance but the other car's headlights were recognized because the high beams shut off shortly before the car swerved.
I would say it is not relevant if the implication is that FSD saw the oncoming car, since those are different, independent systems. I am not even sure the OP was in FSD at the time, but I will take his word for it. At the start of that video, the car is already on the center line and comes back into the lane. If I am on FSD in that situation and see that happening, I am going to disengage on that stretch of road. The great thing about that video is the OP walks away with no injuries.
 
He explains in the YouTube comments that he was trying to get the warning to stop beeping, and that he was doing the wheel shaking, not the car. Would pressing the brake stop the warning? I don't know; one would assume the warning would have gone off once the car detected him moving the wheel. It is confusing to watch.
True. I just looked at the first crash report again, and the person says the vehicle forcefully steered itself into the adjacent vehicle, which doesn't align with what happened here. But I could easily see someone making a bad move in a tense situation with this warning blaring at you to take over despite clearly having control, and then misconstruing events after the fact.
Anyone who has used regular AP should know this. It’s quite surprising one of the older FSD Beta testers doesn’t. Could just be doing it to be dramatic though. Get more Tesla haters sharing the video that way.

Also, controlling the steering wheel is very easy, but wiggling it back and forth 2° each direction isn’t the way you do it lol
AI Addict isn't even just an older FSD Beta tester; he's a Tesla engineer who works on the FSD labelling team.
 
I’d like to see evidence that AP/FSD was even engaged at the time of such an abrupt double-yellow crossing. I wonder if this isn’t a case of the driver thinking AP was engaged when it wasn’t, or of him accidentally disengaging it just before, or bumping the wheel. At worst I’ve only experienced AP taking turns a little wide and gradually entering, and sometimes crossing, the double yellow. But I recall the crossing of the double yellow only occurring with AP1, and only on turns tighter than that. With AP2 I don’t recall crossing the double yellow to that degree, and certainly not that abruptly.
 
I can see why that was taken down. The video pretty clearly shows that driver error was the cause of the accident. The oversteer reaction is really bad. Either OP wasn't really gripping the steering wheel, or they are an inexperienced driver. If you have been driving for any length of time, correcting from getting too close to or running over the median line is something you are familiar with and doesn't cause panic.

I suspect that OP was touching the steering wheel, but not grabbing it. This is poor driver behavior for AP, much less FSD. The takeaway here should be: high alertness is required for the current state of FSD. If you aren't willing to do that, then don't participate. Instead, OP seems to have taken away: any accident in my shiny new car is someone else's fault, and someone else needs to pay. That isn't how cars have ever worked.
 
I’d like to see evidence that AP/FSD was even engaged at the time of such an abrupt double-yellow crossing. I wonder if this isn’t a case of the driver thinking AP was engaged when it wasn’t, or of him accidentally disengaging it just before, or bumping the wheel. At worst I’ve only experienced AP taking turns a little wide and gradually entering, and sometimes crossing, the double yellow. But I recall the crossing of the double yellow only occurring with AP1, and only on turns tighter than that. With AP2 I don’t recall crossing the double yellow to that degree, and certainly not that abruptly.
If it's possible to accidentally disengage AP/FSD mid-turn in a way that even very briefly sends you careening into an oncoming lane of traffic, it seems like a design flaw that should be addressed/mitigated. Situations like this would be why the NHTSA asks for reports covering chunks of time before an accident, to see if and when the system was disengaged, the sequence of events that led up to it, etc.