FSD Beta Attempts to Kill Me; Causes Accident

Long time lurker, first time poster. I have been trying to work with Tesla to resolve this issue out of the public domain, but they have been characteristically terrible and honestly don't seem to care. Over the last 3 weeks, I have sent multiple emails, followed up via phone calls, escalated through my local service center, and nobody from Tesla corporate has even emailed or called to say they are looking into this. One of my local service center technicians opened a case with engineering, which she said would take 90 days to review. I find that absurd, especially when Tesla is releasing new versions every 2 weeks. I think it's important for people to be extra cautious about which roads they engage FSD beta on, especially since Tesla seems to be ignoring my report entirely.

This incident happened almost 3 weeks ago on Monday, November 22nd at around 6:15 in the evening, shortly after the sun had set. I was driving my Tesla Model Y on a two-lane rural road and had FSD engaged. The car was still on version 10.4 at the time. It was a clear night, with no rain or adverse weather conditions. Everything was going fine, and I had used FSD beta on this stretch of road before without a problem. There was some occasional phantom braking, but that had been fairly common with 10.4.

A right-hand banked curve in this two-lane road came up, with a vehicle coming around the curve in the opposite direction. The Model Y slowed slightly and began making the turn properly and without cause for concern. Suddenly, about 40% of the way through the turn, the Model Y straightened the wheel and crossed over the center line into the direct path of the oncoming vehicle. I reacted as quickly as I could, trying to pull the vehicle back into the lane. I really did not have a lot of time to react, so I chose to override FSD by turning the steering wheel, since my hands were already on the wheel and I felt this would be the fastest way to avoid a front-overlap collision with the oncoming vehicle. When I attempted to pull the vehicle back into my lane, I lost control and skidded off into a ditch and through the woods.

I was pretty shaken up and the car was in pieces. I called for a tow, but I live in a pretty rural area and could not find a tow truck driver who would touch a Tesla. I tried moving the car and heard underbody shields and covers rubbing against the moving wheels. I ended up getting out with a utility knife, climbing under the car, and cutting out several shields, wheel well liners, and other plastic bits that were lodged in the wheels. Surprisingly, the car was drivable and I was able to drive it to the body shop.

Right after the accident, I made the mistake of putting the car in park and getting out to check the situation before I hit the dashcam save button. The drive to the body shop was over an hour long, so the footage was overwritten. Luckily, I was able to use some forensic file recovery software to recover the footage from the external hard drive I had plugged in.
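
For anyone curious how that kind of recovery works: tools like PhotoRec carve files out of a raw drive image by scanning for known format signatures rather than relying on the (overwritten) file table. Below is a minimal sketch of that idea for MP4/MOV clips; it is not necessarily the exact tool or method I used, and "drive.img" is a placeholder path for a raw image of the dashcam drive.

```python
# Minimal sketch of signature-based file carving for MP4/MOV clips.
# This shows the general idea behind recovery tools like PhotoRec,
# not the exact method used here. "drive.img" is a placeholder path.
import mmap

SIGNATURE = b"ftyp"  # box type found at byte offset 4 of an MP4/MOV file

def find_mp4_offsets(image_path):
    """Return byte offsets in the image where an MP4/MOV header likely begins."""
    offsets = []
    with open(image_path, "rb") as f, \
         mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
        pos = mm.find(SIGNATURE)
        while pos != -1:
            if pos >= 4:  # a 4-byte box-size field precedes 'ftyp'
                offsets.append(pos - 4)
            pos = mm.find(SIGNATURE, pos + 1)
    return offsets

if __name__ == "__main__":
    print(find_mp4_offsets("drive.img"))
```

A real carver also walks the container's box structure to figure out where each clip ends; this only locates candidate starting points, which is why a dedicated recovery tool is the better bet in practice.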

In the footage, you can see the vehicle leave the lane, and within about 10 frames, I had already begun pulling back into the lane before losing control and skidding off the road. Since TeslaCam records at about 36 frames per second, this would mean I reacted within about 280ms of the lane departure. I understand it is my responsibility to pay attention and maintain control of the vehicle, which I agreed to when I enrolled in FSD beta. I was paying attention, but human reaction does not get much faster than this, and I am not sure how I could have otherwise avoided this incident. The speed limit on this road is 55mph. I would estimate FSD was probably going about 45-50mph, but I have no way to confirm. I think the corrective steering I applied was too sharp given the speed the vehicle was going, and I lost grip with the pavement. On the version of the clip slowed to 40% speed, you can sort of see the back end of the car break loose in the way the front end starts to wiggle as the mailbox makes its way to the left side of the frame.
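
For anyone who wants to check my math, here's the conversion, plus a rough check on why the correction broke traction. The 36 fps figure is approximate, and the friction coefficient is a generic assumption for dry pavement, not something I measured.

```python
# Back-of-envelope checks on the numbers above. Frame rate and frame
# count are approximate; the friction coefficient is a generic
# dry-pavement assumption, not a measured value.

FRAME_RATE_FPS = 36      # approximate TeslaCam recording rate
REACTION_FRAMES = 10     # frames between lane departure and visible correction

reaction_s = REACTION_FRAMES / FRAME_RATE_FPS
print(f"Estimated reaction time: {reaction_s * 1000:.0f} ms")  # ~278 ms

# Why a sharp correction at speed can break traction: the tightest arc
# the tires can hold is r = v^2 / (mu * g); steering tighter than that
# demands more lateral force than the pavement can supply.
MU = 0.9                 # assumed tire/dry-asphalt friction coefficient
G = 9.81                 # gravitational acceleration, m/s^2
speed_ms = 50 * 0.44704  # 50 mph converted to m/s

min_radius_m = speed_ms**2 / (MU * G)
print(f"Tightest holdable arc at 50 mph: ~{min_radius_m:.0f} m radius")  # ~57 m
```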

Surprisingly, I somehow managed to steer this flying car through a mini-forest, avoiding several trees (although I did knock off the driver's side mirror). There is no side panel damage whatsoever. The bumper cover is ruined and the car sustained fairly severe structural/suspension damage, both front and rear suspension components.

Luckily, nobody was hurt (except my poor car). I could not imagine the weight on my conscience if I had been too slow to intervene and ended up striking that oncoming vehicle. Front overlap collisions are some of the most deadly ways to crash a car, and bodily injury would have been very likely.

I have a perfect driving record and have never had an at-fault accident in the over 10 years I have been licensed. The thought of filing an insurance claim and increasing my premiums over this incident makes me sick. I am considering legal action against Tesla, but I'm not going to get into that here. Just wanted to make everyone aware and hyper-vigilant about FSD. I thought I was, but then this happened. I am going to be much more careful about the situations in which I decide to engage it. There is too much at stake, it is not mature enough, and frankly, Tesla's apathy and lack of communication around this incident really concerns me, as both an owner and a road-user.


tl;dr: Be careful with FSD, folks. And if you get into an accident, hit the dashcam save button or honk your horn before you put it in park.



"Display of a Tesla car on autopilot mode showing current speed, remaining estimated range, speed limit and presence of vehicles on motorway lanes" by Marco Verch is licensed under CC BY 2.0.
 
Might as well chime in.

I've had FSD beta since 10.1 and drive daily on a road somewhat similar to this. It is a little more populated, with a speed limit of 40, but still a two-lane road with some curves.

Since day one (even before, when I was just using AP), my car has sometimes drifted onto the yellow line, riding quite a bit further left than I would ever drive, as many on here have pointed out.

But I can also say that anytime I take a curve (especially with an oncoming car), I have both hands directly on the wheel and don't let the car drift. There have been quite a few times where I've "taken over" without meaning to, because I just can't let the car get that close; it's not worth it.

Now, if it is true that the car jerked into the oncoming lane, it messed up badly. The driver also messed up badly by steering directly off the road. In no way was this all Tesla's fault. Anyone who has used this FSD feature knows it messes up all the time, and you have to know how to correct its behavior properly.
 
I think we can expect more action on the driver monitoring side with incidents like this occurring; they'll continue refining the monitoring systems and will likely ramp up the consequences.

It shouldn't be news to anyone that FSD Beta can do the wrong thing at the worst time, which is why you need to be ready to take over at all times. The fault here would lie in monitoring systems insufficient to ensure drivers are appropriately engaged, in drivers not being properly trained for these situations, or in other mitigations that are lacking.
 
I certainly understand your desire for actual evidence, and it's unfortunate there isn't cabin footage of this incident. Probably eventually we'll have some from some other incident.

But putting that aside, remember that Tesla tells us that this is how FSD behaves, so on the face of it, I don't have any issue believing that FSD was engaged in this case (all the behaviors in the video are also consistent with FSD being engaged until the intervention/overcorrection). I also have little doubt that the driver's hands were not at 9 & 3. In general, it's hard (though still possible if you're really jumpy - but then maybe don't use FSD Beta?) to overcorrect in a situation like this with both hands on the wheel in the optimal position, applying slightly mismatched torques to satisfy the sensor, as they should be at all times when the system is engaged and the vehicle is in motion.

We'll likely never know, but I'm content with assuming in this case that FSD was engaged, and this was driver error in response to a typical FSD error, of the kind Tesla describes. I'm not broken up about it, but it shows the importance of taking the task of using FSD Beta seriously. I think you said earlier you don't have FSD? I assure you this behavior is normal and expected, as Tesla says. And this shows that the behavior can be extremely dangerous when the driver is not 100% in control of the vehicle while FSD is engaged.

I'd like Tesla to get even stricter about kicking people out of the program with their driver monitoring. It's actually somewhat decent right now - they really do detect if you're not looking straight ahead for a period of time - but I'd love for people getting kicked out (for a long period of time) to become more routine. It's not a major loss for the owner at all to lose FSD Beta access, so I feel like Tesla could fine-tune the system and weed out more people. There likely won't be any backlash - it just makes it a better game, since that's how Tesla has set things up so far - and it will help improve safety (which is a major concern when the system has limitations like this and drivers are not sufficiently trained).

I have one car with FSD, but not the Beta, which is what you are likely referring to.
I get that FSD Beta may have higher authority to cross left into oncoming lanes to go around or give clearance to objects on the right side. But it seems odd that it would do so without any indication of something on the right to give a wider berth to, especially with oncoming traffic that should be easy to detect.

I’m just looking for solid evidence that this happens, especially if it happens “quite a lot” when there is also clear indication of oncoming traffic.

I recall videos of other FSD Beta usage in more urban/suburban environments showing the car wait for oncoming traffic and then go around something on the right, crossing into the oncoming lane as we all typically would do.

I won’t be too surprised to find that this might happen occasionally, especially when there is something on the right to go around, but I’m just looking for evidence of “quite a lot” of this into oncoming traffic.
 
Interesting. I'm going to hold judgement for now.

I remember all of those unintended acceleration cases years ago. Drivers/owners were totally convinced that it was their Tesla trying to kill them. Savvy technical people looked into them, and nearly all of them were driver error. Those same drivers/owners didn't pursue it or release info to the public. Never heard any retractions, either.

Hint: Those same people that looked into unintended acceleration can look into this case and extract the logs. Just make sure you ask for $$$ cash up-front and sell your Tesla's logs as-is, no conditions, no refunds, etc.
 
Might as well chime in.

I've had FSD beta since 10.1 and drive daily on a road somewhat similar to this. It is a little more populated, with a speed limit of 40, but still a two-lane road with some curves.

Since day one (even before, when I was just using AP), my car has sometimes drifted onto the yellow line, riding quite a bit further left than I would ever drive, as many on here have pointed out.

But I can also say that anytime I take a curve (especially with an oncoming car), I have both hands directly on the wheel and don't let the car drift. There have been quite a few times where I've "taken over" without meaning to, because I just can't let the car get that close; it's not worth it.

Now, if it is true that the car jerked into the oncoming lane, it messed up badly. The driver also messed up badly by steering directly off the road. In no way was this all Tesla's fault. Anyone who has used this FSD feature knows it messes up all the time, and you have to know how to correct its behavior properly.
Right. I have seen the more gradual drifting wide on a curve since AP1, which was greatly reduced in AP2. The jerking left is the unusual behavior that makes me wonder if that was really FSD doing that.
 
but it seems odd that it would do so without any indication of something on the right to give a wider berth to
There was a mailbox there which, as discussed earlier, could have been a factor here. We'll likely never know. 10.4 had a lot of false pedestrian detections, and the current build probably still has them, though this particular one is fixed:


I’m just looking for evidence of “quite a lot” of this into oncoming traffic.

I can't provide that evidence of it going into oncoming traffic - it hasn't happened to me, but I'm confident that it could if I used it more. But 10.4 had a lot of odd steering behavior (the current build probably does too!) - and no, these are not examples of double-yellow-line crossing; they just show the limitations. Here's some interesting steering behavior I've observed. Note that in these videos I'm not obeying my 9 & 3 rule (sad!), but I do have a firm grip on the wheel with my arm in a high-leverage position, so corrections are smooth - it also helps to always expect it to do the wrong thing - I'm always amazed when it completes a turn correctly and I don't have to disengage, to be honest!


This thread, by the way, demonstrates that the vehicle will follow the projected path (see the third picture in the first link above). If for some reason the path planner decides that it needs to deviate into the opposing lane, it will. This can happen, and while I haven't personally seen it occur into oncoming traffic, I have no doubt that it could, and I believe the videos showing that. Certainly it will cross double yellows for no good reason, so if it actually has a good reason ("pedestrian" right next to the roadway), it will do it as well.

If there were cabin footage, I'm fairly sure we'd see the path planner suddenly deviate into opposing traffic, and the vehicle follow.
 
I certainly understand your desire for actual evidence, and it's unfortunate there isn't cabin footage of this incident. Probably eventually we'll have some from some other incident.

But putting that aside, remember that Tesla tells us that this is how FSD behaves, so on the face of it, I don't have any issue believing that FSD was engaged in this case (all the behaviors in the video are also consistent with FSD being engaged until the intervention/overcorrection). I also have little doubt that the driver's hands were not at 9 & 3. In general, it's hard (though still possible if you're really jumpy - but then maybe don't use FSD Beta?) to overcorrect in a situation like this with both hands on the wheel in the optimal position, applying slightly mismatched torques to satisfy the sensor, as they should be at all times when the system is engaged and the vehicle is in motion.
If there were previous videos that this person posted showing how he used FSD Beta (as suggested by the Jalopnik post linked in my earlier comment), we might have an answer.

But in the absence of that, I agree that if the OP had both hands on the wheel, the likelihood of an overcorrection of that magnitude is quite low (from my experience with AP). You don't even have to apply so much torque as to kick yourself out of AP/FSD; you only need to provide enough to resist the movement of the wheel, which isn't hard with both hands on the wheel (but much harder with only a light touch of one hand, or when grabbing a wheel that had no hands on it). There's also using the stalk or tapping the brake to kick it out.
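
To put rough numbers on that: think of it as two thresholds - a small torque that registers as hands-on, and a much larger one that overrides autosteer. Here's a toy sketch of the idea; the numbers are hypothetical placeholders, since Tesla's actual thresholds and detection logic aren't public.

```python
# Toy model of torque-based wheel-hold detection. Illustrative only:
# the thresholds below are hypothetical, not Tesla's actual values.

DETECT_NM = 0.3    # hypothetical torque that registers as "hands on"
OVERRIDE_NM = 2.5  # hypothetical torque that overrides/disengages autosteer

def classify(torque_nm):
    """Classify driver-applied steering torque against the toy thresholds."""
    if torque_nm < DETECT_NM:
        return "nag: no hands detected"
    if torque_nm < OVERRIDE_NM:
        return "hands-on: resisting the wheel without overriding"
    return "override: autosteer kicks out"

for t in (0.1, 1.0, 3.0):
    print(f"{t:.1f} Nm -> {classify(t)}")
```

The middle band is the point: a firm two-handed grip can resist unwanted wheel movement while staying well below the override threshold.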

I'm guessing Tesla would also treat it the same as if the car were on AP and drifted into the other lane, which from the discussion above has happened in other cases (they were just largely non-events because there was no overcorrection). As for the crash itself, that's all down to the unnecessary overcorrection.
 
but I’m just looking for evidence of “quite a lot” of this into oncoming traffic.

Understood. I don't care for the "quite a lot" threshold established here, though. One time is all it takes - it doesn't have to be common! I'm not interested in debating how common things are, since that's basically subjective with the information we have at our disposal. The question is whether it can happen - and while it has not happened to me, I'm quite confident that it could, based on my observations of the behavior. I would describe this specific behavior as not common, but way too frequent.

If you poke through the videos, I think you'll find ample evidence of highly questionable behavior of FSD Beta. Will you reach the conclusion that this behavior in the video could have occurred with FSD engaged? I have no idea - it depends on your biases and your personal threshold for evidence. But personally I think the balance of the evidence in the public domain suggests that it is perfectly plausible that this is an error by FSD Beta.

And this is exactly how any user of FSD Beta should expect it to behave. That's just the state of things, whether we like it or not. Stay alert, and especially if there is little margin for error, definitely keep both hands on the wheel - the vehicle can become an extreme hazard to other road users at any time, as Tesla says.
 
It is linked in my previous post. Are you blind, or do you just want to be difficult?

I ask you to document that FSD is really a safe and trustworthy system that never crashes. Tesla should be able to help you. Start by asking for a disengagements-per-mile report for the beta.

Please stop lying about the supreme capabilities of FSD beta if one cannot show the evidence for this!
The FSD beta is neither safe nor trustworthy. It is a system in development. And its capabilities are supreme.
 
We have quite a lot of evidence that FSD can suddenly fail to stay in its lane or even steer towards oncoming traffic.
Yes, we know. There are literally thousands of posts about this - if only you cared to remove your prejudice.

The consistent denial of FSD and AP shortcomings reminds me of the culture shown in "Chernobyl". It is disconcerting.
You are the one in denial. You haven't read all the stuff we talk about in other threads detailing all the issues, yet you assume everybody is in denial. Go and read all the posts I've made about all the issues before spouting nonsense.
 
Yes, we know. There are literally thousands of posts about this - if only you cared to remove your prejudice.


You are the one in denial. You haven't read all the stuff we talk about in other threads detailing all the issues, yet you assume everybody is in denial. Go and read all the posts I've made about all the issues before spouting nonsense.
Ah, progress maybe. So you now admit the shortcomings of FSD beta, so maybe we agree? But you still attack me for misunderstanding your real opinion?
 
Ah, progress maybe. So you now admit the shortcomings of FSD beta, so maybe we agree? But you still attack me for misunderstanding your real opinion?
I'm not "now admitting" anything. You are spouting nonsense based on your ignorance. That's all.

Have you read all the comments people have been making? Do you understand the meaning of "early access beta"?

PS: Why is the OP absconding? Apparently you are more interested in defending him than he is. I'd be very interested in the logs and any police report / breathalyzer test. Why are Tesla haters not interested in finding actual facts?
 
There was a mailbox there which, as discussed earlier, could have been a factor here.

At the time of the swerve across the double yellow, the mailbox was way ahead -- so far ahead that the driver managed to overcorrect and drive to the right side of the mailbox. It seems likely that the mailbox was NOT what FSD was planning to give a wide berth to by going left.
ample evidence of highly questionable behavior of FSD Beta.

Sure. But I characterize driving into pretty obvious oncoming traffic as much worse than other instances of highly questionable behavior, and I was seeking information about the prevalence of this particular sort of extreme error.

For instance, the video where it looked like FSD was going to take a left turn while waiting in a turn lane in front of a fast-moving oncoming car revealed, upon closer inspection, that while it crept out, it had also calculated a path around the oncoming car, as indicated in the path-planner video.

Just looking to get better information on that particular type of possible error and its causes, or on possible incorrect interpretations of its intended path.
 
I'm not "now admitting" anything. You are spouting nonsense based on your ignorance. That's all.

Have you read all the comments people have been making? Do you understand the meaning of "early access beta"?

PS: Why is the OP absconding? Apparently you are more interested in defending him than he is. I'd be very interested in the logs and any police report / breathalyzer test. Why are Tesla haters not interested in finding actual facts?
No agreement then. So it is reasonable to expect that FSD beta will try to crash into oncoming cars? That is the level of capability you are satisfied with? Why is it so hard to say FSD beta s*cks (but it is exciting)?

I have read a ton of your posts on this subforum, so I'm just trying to understand what you really mean. It seems you just defend Tesla from any criticism whatsoever. I don't know your vested interests either.
 
and almost never stands up to scrutiny.

There is video. You've created your own story that ignores the initial cause of the overcorrection.

But what do Tesla's neural networks do when confronted by a King Solomon's decision?

It does nothing because there's no portion of their system that attempts such a thing.

"I had an accident using FSD Beta".

More like: FSD veered toward an oncoming car that was clearly visible, I panicked, and I overcorrected. Again, you're absolutely ignoring the initial condition as though it wasn't the root cause.
 
No agreement then. So it is reasonable to expect that FSD beta will try to crash into oncoming cars? That is the level of capability you are satisfied with? Why is it so hard to say FSD beta s*cks (but it is exciting)?
Why can't you say FSD beta is a great step in AV progression and a nerd's dream come true?

I have read a ton of your posts on this subforum, so I'm just trying to understand what you really mean. It seems you just defend Tesla from any criticism whatsoever. I don't know your vested interests either.
You have done nothing of the sort. You have only read, apparently, my posts in this thread. Go read all my posts in the following threads and stop lying continuously. And then show me all your posts where you have been positive about Tesla.





And finally I'll leave you with this ..




 
- so far ahead that the driver managed to overcorrect and drive to the right side of the mailbox. It seems likely that the mailbox was NOT what FSD was planning to give a wide berth to by going left.

I hope you're wrong. The mailbox was two to three seconds away, and you'd definitely hope the system could identify objects that far away (like pedestrians, etc.) which are in plain view of the driver and extremely visible. It's essential!

Any human paying attention would have had no problem identifying that there was an object at the mailbox location prior to the swerve, as the screen capture shows. So FSD must be able to identify this type of object (and I believe it often can).

For instance, the video where it looked like FSD was going to take a left turn while waiting in a turn lane in front of a fast-moving oncoming car revealed, upon closer inspection, that while it crept out, it had also calculated a path around the oncoming car, as indicated in the path-planner video.
Not clear which one you're referring to, but the one with the oncoming truck has accompanying video from the cabin which clearly shows a path plotted in front of the oncoming vehicle (which is just feet away).

Definitely take a look at more videos if you have the time. Since you do not have FSD Beta on your vehicle, it is difficult to internalize the type of errors it can make, but there is an increasing amount of video evidence out there. More people need to take video from inside the car, so we can eventually capture one of these instances. Of course, we'll always have people who say "it wouldn't have actually made the turn," because with any luck, interventions occur as soon as the steering wheel turns. That being said, watching videos is highly misleading - they make FSD look a lot better than it actually is, in my experience, even if they are completely unedited, unless you're watching VERY closely and know what to look for (focus on the speedometer).

(As far as I am concerned, as soon as the steering wheel turns, whether it makes the turn or not, that shows that this is possible. But this is just based on my experience with FSD Beta, which willingly crosses double yellow lines in low-hazard situations; of course I can't prove it does so in hazardous situations, since I always intervene.)

Anyway, all of these types of errors are completely expected. I figure we'll be in this state for another 2-3 years.
 
There is video. You've created your own story that ignores the initial cause of the overcorrection.
That video shows *nothing* to do with FSD or even AP. You have accepted the OP's words as gospel with zero evidence.
I'm very clear in my view - we just don't have the facts. Let's look at all the facts before jumping to conclusions.
 
Yup.

NHTSA will literally move heaven and earth to recover data from a burned-out Tesla so they can try to pin something on AP/FSD. This car is damaged, but intact.

That video shows *nothing* to do with FSD or even AP. You have accepted the OP's words as gospel with zero evidence.

You haters are so gullible.

I'm very clear in my view - we just don't have the facts. Let's look at all the facts before jumping to conclusions.
 
In the more recent iterations, I feel like FSD freaks out when there is an oncoming vehicle. Maybe the other car's headlights are blinding the cameras? I've had several instances of my Model S trying to go into the oncoming car instead of away from it. I've had to pull it back, but luckily I didn't lose control. That's not to say next time I won't lose control.