
FSD Beta Attempts to Kill Me; Causes Accident

Long-time lurker, first-time poster. I have been trying to work with Tesla to resolve this issue privately, but they have been characteristically terrible and honestly don't seem to care. Over the last 3 weeks, I have sent multiple emails, followed up via phone calls, and escalated through my local service center, and nobody from Tesla corporate has even emailed or called to say they are looking into this. One of my local service center technicians opened a case with engineering, which she said would take 90 days to review. I find that absurd, especially when Tesla is releasing new versions every 2 weeks. I think it's important for people to be extra cautious about which roads they engage FSD beta on, especially since Tesla seems to be ignoring my report entirely.

This incident happened almost 3 weeks ago, on Monday, November 22nd, at around 6:15 in the evening, shortly after the sun had set. I was driving my Tesla Model Y on a two-lane rural road with FSD engaged. The car was still on version 10.4 at the time. It was a clear night, with no rain or adverse weather conditions. Everything was going fine, and I had used FSD beta on this stretch of road before without a problem. There was some occasional phantom braking, but that had been fairly common with 10.4.

A banked right-hand curve came up in this two-lane road, with a vehicle coming around the curve in the opposite direction. The Model Y slowed slightly and began making the turn properly and without cause for concern. Suddenly, about 40% of the way through the turn, the Model Y straightened the wheel and crossed over the center line into the direct path of the oncoming vehicle. I reacted as quickly as I could, trying to pull the vehicle back into the lane. I really did not have a lot of time to react, so I chose to override FSD by turning the steering wheel, since my hands were already on the wheel and I felt this would be the fastest way to avoid a front-overlap collision with the oncoming vehicle. When I attempted to pull the vehicle back into my lane, I lost control and skidded off into a ditch and through the woods.

I was pretty shaken up and the car was in pieces. I called for a tow, but I live in a pretty rural area and could not find a tow truck driver who would touch a Tesla. I tried moving the car and heard underbody shields and covers rubbing against the moving wheels. I ended up getting out with a utility knife, climbing under the car, and cutting out several shields, wheel well liners, and other plastic bits that were lodged into the wheels. Surprisingly, the car was drivable and I was able to drive it to the body shop.

Right after the accident, I made the mistake of putting the car in park and getting out to check the situation before I hit the dashcam save button. The drive to the body shop was over an hour long, so the footage was overwritten. Luckily, I was able to use some forensic file-recovery software to recover the footage from the external drive I had plugged in.

In the footage, you can see the vehicle leave the lane, and within about 10 frames I had already begun pulling back into the lane before losing control and skidding off the road. Since Teslacam records at about 36 frames per second, this would mean I reacted within about 280ms of the lane departure. I understand it is my responsibility to pay attention and maintain control of the vehicle, which I agreed to when I enrolled in FSD beta. I was paying attention, but human reaction does not get much faster than this, and I am not sure how I could have otherwise avoided this incident. The speed limit on this road is 55mph. I would estimate FSD was probably going about 45-50mph, but I have no way to confirm. I think the corrective steering I applied was too sharp given the speed the vehicle was going, and I lost grip with the pavement. In the 40%-speed slowed-down version of the clip, you can sort of see the back end of the car break loose in the way the front end starts to wiggle as the mailbox makes its way to the left side of the frame.
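For anyone checking the math, here is a quick sketch of the frame-count-to-reaction-time conversion. The ~36 fps Teslacam rate is taken from the post; treat it as an assumption, since the exact rate can vary by firmware.

```python
# Convert a dashcam frame count into an approximate reaction time.
# The ~36 fps recording rate is an assumption taken from the post above.
def frames_to_ms(frames: int, fps: float = 36.0) -> float:
    """Return the elapsed time in milliseconds for `frames` frames."""
    return frames / fps * 1000.0

# 10 frames at ~36 fps works out to roughly 278 ms.
print(round(frames_to_ms(10)))
```

At 36 fps, one frame is about 28ms, so a 10-frame response is closer to 280ms than 360ms.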

Surprisingly, I somehow managed to steer this flying car through a mini-forest, avoiding several trees (although I did knock off the driver's side mirror). There is no side panel damage whatsoever. The bumper cover is ruined and the car sustained fairly severe structural/suspension damage, both front and rear suspension components.

Luckily, nobody was hurt (except my poor car). I could not imagine the weight on my conscience if I had been too slow to intervene and ended up striking that oncoming vehicle. Front overlap collisions are some of the most deadly ways to crash a car, and bodily injury would have been very likely.

I have a perfect driving record and have never had an at-fault accident in the over 10 years I have been licensed. The thought of filing an insurance claim and increasing my premiums over this incident makes me sick. I am considering legal action against Tesla, but I'm not going to get into that here. I just wanted to make everyone aware and hyper-vigilant about FSD. I thought I was, but then this happened. I am going to be much more careful about the situations in which I decide to engage it. There is too much at stake, it is not mature enough, and frankly, Tesla's apathy and lack of communication around this incident really concern me, as both an owner and a road user.


tl;dr: Be careful with FSD, folks. And if you get into an accident, hit the dashcam save button or honk your horn before you put it in park.



“Display of a Tesla car on autopilot mode showing current speed, remaining estimated range, speed limit and presence of vehicles on motorway lanes” by Marco Verch is licensed under CC BY 2.0.
 
If it's possible to accidentally disengage AP/FSD mid-turn in a way that even very briefly sends you careening into an oncoming lane of traffic, it seems like a design flaw that should be addressed/mitigated. Situations like this are why the NHTSA asks for reports covering a chunk of time before an accident: to see if and when the system was disengaged, the sequence of events that led up to it, etc.
The only ways for what you describe to happen are:

1. The driver applies torque toward the outside of the curve and disengages FSD (MAJOR DRIVER ERROR)
2. The driver's hands are not on the wheel and they hit the brakes, causing the car to continue on a tangent into the other lane (ANOTHER DRIVER ERROR)

Neither of these is AP/FSD's fault; both are just driver error.
 
Just crossing the yellow line on turns can happen with AP, but it's very rare. Even FSD doesn't cross the yellow line on slight bends. There is definitely more to this story than what the OP says. For all we know, AP/FSD was disengaged before the car started crossing the line.
Well, I've certainly had the beta hit the reflectors and ride the outside center yellow line every time on a curved road... that is scary, and other cars surely freak out when they see me nearly coming over. On a curve, the car should be designed to bias slightly toward the inside of the lane. Or at least keep the center of the lane!!!
 
Huh? The car veered about 2-3 seconds before the mail post. If that was a false detection of a pedestrian (conceivable I think), it could explain the behavior (even though it was completely the wrong behavior). That is exactly when you would expect a reaction.

Note that you would not easily be able to see any slowing in the video for a pedestrian detection.

Not saying that is what happened, but it certainly seems possible, and the timing would fit.

If the OP could provide the location (he won't, of course), then someone could drive that road at night and look at the visualizations, or drive similar scenarios.

Certainly, I have seen many false pedestrian detections before (they have been partially resolved in the latest software), so it seems possible. This did take place on 10.4, which I think is the version where I saw them.
I think the misidentified pedestrian theory is definitely likely. I’ve had a lot of mailboxes shown as pedestrians on the screen on suburban/rural roads like this. It caused phantom braking, not swerving in my case, but I may have gotten lucky.

I also wonder the more I watch this whether the driver could have accidentally disengaged autosteer but not TACC. It looks more like a drift than an explicit swerve out of the lane.
 
The only ways for what you describe to happen are:

1. The driver applies torque toward the outside of the curve and disengages FSD (MAJOR DRIVER ERROR)
2. The driver's hands are not on the wheel and they hit the brakes, causing the car to continue on a tangent into the other lane (ANOTHER DRIVER ERROR)

Neither of these is AP/FSD's fault; both are just driver error.
Perfect. Now we can ask whether anything else can be done to eliminate or further reduce the likelihood of this error, to prevent similar situations in the future.
 
Other drivers didn't sign up to be guinea pigs in Tesla's quest for robotaxis. If the Tesla had hit that car, there would have been a huge lawsuit. Regardless of the beta testing and Tesla saying drivers are responsible if the car crashes, it is Tesla and the driver who will be on the hook for damages. There are so many videos of FSD beta messing up that I'm surprised Tesla lets drivers actually record it. Why was the video of this incident taken down? Maybe it shows Tesla in a very bad light. I really don't care if a Tesla driver wrecks his car correcting an FSD mistake. I do care when people could get hurt, including Tesla drivers.
While both Tesla and the driver would have been sued, it isn't a slam dunk that Tesla would ultimately be held liable in any regard. They have a number of defenses and the company would certainly argue that the driver's over correction of the steering is what caused the accident. (Also, I'm sure they would argue that there isn't sufficient evidence to conclude that the car would have even caused an accident if the driver had not intervened, as the car had adequate time to correct itself.)

I am not saying these arguments would definitely absolve Tesla, but they would have a better than 50/50 shot of not paying any damages or paying less than half of the damages (depending on the laws of the state where the accident happened). Regardless, Tesla (or any other car manufacturer) would not want the negative press.

As the AP/FSD technology becomes more sophisticated and more commonplace with other manufacturers, I'm sure we will see a number of these lawsuits and it will be interesting to see how those cases are resolved (assuming they don't settle).

Thankfully, no one was hurt in this instance.
 
Not crossing double yellow lines into an oncoming car has to be one of the simplest tasks for driverless tech. I doubt we will see a case like this in court, because a strong fix will be in place. Level 2 accidents, where the driver is in control, will be complicated if one reaches a court.
 
The more people this gets rolled out to, the more incidents are going to occur. Regulators won’t put up with it much longer. Then we all lose it…and that will slow down or halt fsd development progress.
Not necessarily. Regulators could respond by passing regulations, which could increase the adoption of self-driving technology and speed development up.
 
The more people this gets rolled out to, the more incidents are going to occur. Regulators won’t put up with it much longer. Then we all lose it…and that will slow down or halt fsd development progress.
Regulators don't start percolating until there are deaths or serious injuries. I'm thinking we can have dozens of fender benders and it will dust right off. You also have to compare against the number of accidents prevented. If Tesla comes out and says FSD drivers were in fewer accidents than non-FSD drivers, then the number of incidents doesn't matter, since it would have prevented more incidents than it caused.
 
Giving the OP the benefit of the doubt that everything we see from the front dash footage up to the driver swerve was FSD beta, I think there is merit to FSD mistaking the mailbox as a pedestrian/VRU and going over the center line to provide clearance. I've had the car inappropriately try to pass (oncoming car is present) without attempting to slow down first.

I know each of our driving abilities and experiences are different, and what I'm about to say only applies to myself. I have been in several situations very similar to what the OP describes, but more common during rain. Even without a false positive VRU detection, the car loses sight of the lines and crosses over the double yellow. While the FSD mistakes are abrupt, I'm fully alert and can disengage without any change to my resting heart rate :) I suspect the OP was not fully alert, and that increased how much he compensated to disengage.

I have a hard time seeing how suing Tesla would produce a desired outcome, given the acceptance of terms in the UI before you can enable FSD beta. What happened here is completely covered by those terms.
 
I’ve had multiple incidents like OP described / the video. Easily reproducible on narrow lane roads where the car appears to steer at traffic and sometimes repeatable without traffic.

Crossing double yellow lines happens. I’ve had it do it to avoid an SUV which turned in front of me. A normal driver would hit the brakes hard, FSD beta couldn’t see the approaching traffic and assumed it was safer to try and go around facing opposing traffic. I don’t envy the choices Tesla has to make in situations like this once it becomes more refined.

I like my FSD beta but I don’t trust it - yet. It’s improved MASSIVELY in functionality in the time I’ve had it but the reliability / repeatability has me concerned.
 
I'm always suspicious of posts like these when the OP does not come back to respond at all. High or low post count isn't even relevant. A post like this is meant to draw discussion. There are lots of questions being raised that the OP could help clarify, yet they remain silent... ❓
Yes, this is the most important aspect of this thread. The only post from this member is a video claiming FSD tried to kill them. They did post some comments on YouTube which were completely omitted from the OP, and then the video got deleted.
 
What?!

You mean it did not happen exactly as described in the OP?

There are so many possibilities; without the interior camera it's difficult to tell. But it sounds like he got an FCW and not a nag. Did he think it was just a nag and jerk the wheel harder, causing the swerves we see?

BTW, did the OP file a police report?
I believe the YouTube comments referred to in that post are not the OP's comments, but comments from one of the FSD YouTubers.

The comments on the OP's YouTube video are documented in an earlier post in this thread.
 
I'm sorry this happened to you. I have stated repeatedly that, on 2 lane roads, FSD hugs the center-line on straight-aways and crosses it on right-hand curves and I find myself playing chicken with oncoming traffic. Others have said the same, and there is a whole thread about it. Until this is resolved, I will not use FSD on winding, 2 lane roads. It is far too dangerous and Tesla really needs to address it. I don't know why they can't have the car track in the center of the lane.
 
I believe the YouTube comments referred to in that post are not the OP's comments, but comments from one of the FSD YouTubers.
Oh, I thought it was referring to the OP.

This was about AI Addict, a totally different situation.

So, I guess we are back to square one. No idea what actually happened, and the OP will not post another comment on this and has deleted the video.
 
This lends some believability to the report of the first FSD Beta accident.

If you don't hit the brake, TACC remains engaged. The video from Addict makes it look like FSD steering was disengaged, TACC was still engaged, and the alert on screen kept playing because Tesla's software has TONS of bugs in it.

I’d like to see evidence that AP/FSD was even engaged at time of the double yellow crossing so abruptly.

[FSD Beta 8.2] Oakland - Close Calls, Pedestrians, Bicycles!
Tesla Full Self-Driving running a red light
Tesla FSD Beta 10.2 Fails simple .5 mile drive In Sacramento, CA 😭
Tesla FSD Beta 10.3.1 Fail 🤦‍♂️
Doing DoorDash on FSD Beta
Tesla FSD beta 10.3.1 - that was close!
[FSD Beta 10.2] San Jose Downtown Stress Test
FSD Beta v10.0 First Drive & Impressions | Dog, Stops, Peds Xing | 2021.24.15 FSD Beta 10

Would you like more examples, or does that cover it?

A prime directive of any flavor of AP/FSD should be to avoid, and certainly not cause, a head-on collision.

Sadly, Tesla's behavior has been getting worse since 8.x two years ago.

Also, FSD would usually slow down when going around an obstacle.

We've seen countless videos of it jerking the wheel into oncoming traffic. I've shared many above, feel free to point out which ones it hit the brakes for.
 
I'm always suspicious of posts like these when the OP does not come back to respond at all. High or low post count isn't even relevant. A post like this is meant to draw discussion. There are lots of questions being raised that the OP could help clarify, yet they remain silent... ❓
What question could he possibly answer?
I'm guessing what happened is either that Tesla agreed to pay for damages and he signed an NDA, or he got an attorney who told him to take the video down and not discuss the case.
Neither of these is AP/FSD's fault; both are just driver error.
Is it possible for collision to be AP/FSD's fault?
He claims he was holding the wheel. It seems the issue is that people never practice what to do when their car steers into oncoming traffic.
 