Welcome to Tesla Motors Club

FSD Beta Attempts to Kill Me; Causes Accident

Long time lurker, first time poster. I have been trying to work with Tesla to resolve this issue outside the public domain, but they have been characteristically terrible and honestly don't seem to care. Over the last 3 weeks, I have sent multiple emails, followed up via phone calls, and escalated through my local service center, and nobody from Tesla corporate has even emailed or called to say they are looking into it. One of my local service center technicians opened a case with engineering, which she said would take 90 days to review. I find that absurd, especially when Tesla is releasing new versions every 2 weeks. I think it's important for people to be extra cautious about which roads they engage FSD beta on, especially since Tesla seems to be ignoring my report entirely.




This incident happened almost 3 weeks ago, on Monday, November 22nd, at around 6:15 in the evening, shortly after the sun had set. I was driving my Tesla Model Y on a two-lane rural road and had FSD engaged. The car was still on version 10.4 at the time. It was a clear night with no rain or adverse weather conditions. Everything was going fine, and I had used FSD beta on this stretch of road before without a problem. There was some occasional phantom braking, but that had been sort of common with 10.4.

A banked right-hand curve in this two-lane road came up, with a vehicle coming around the curve in the opposite direction. The Model Y slowed slightly and began making the turn properly and without cause for concern. Suddenly, about 40% of the way through the turn, the Model Y straightened the wheel and crossed over the center line into the direct path of the oncoming vehicle. I reacted as quickly as I could, trying to pull the vehicle back into the lane. I really did not have a lot of time to react, so I chose to override FSD by turning the steering wheel, since my hands were already on the wheel and I felt this would be the fastest way to avoid a front-overlap collision with the oncoming vehicle. When I attempted to pull the vehicle back into my lane, I lost control and skidded off into a ditch and through the woods.
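For intuition on why a sharp corrective input at highway speed can break traction, here is a rough friction-circle estimate. All numbers are illustrative assumptions (a dry-asphalt friction coefficient of about 0.9), not measurements from this incident:

```python
# Rough friction-circle sketch: the tightest curve a tire can hold at speed v
# without sliding has radius r_min = v^2 / (mu * g). Illustrative numbers only,
# not measurements from the incident.
MU = 0.9    # assumed tire-road friction coefficient, dry asphalt
G = 9.81    # gravitational acceleration, m/s^2

def min_radius_m(speed_mph: float) -> float:
    """Minimum non-skidding turn radius in meters at a given speed in mph."""
    v = speed_mph * 0.44704   # mph -> m/s
    return v ** 2 / (MU * G)

for mph in (45, 50):
    print(f"{mph} mph -> minimum radius ~ {min_radius_m(mph):.0f} m")
# 45 mph -> minimum radius ~ 46 m
# 50 mph -> minimum radius ~ 57 m
```

At 45-50 mph, any steering input that asks the tires for a radius much tighter than roughly 45-55 m exceeds the available grip, which is consistent with a hard yank on the wheel breaking the rear end loose.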

I was pretty shaken up and the car was in pieces. I called for a tow, but I live in a pretty rural area and could not find a tow truck driver who would touch a Tesla. I tried moving the car and heard underbody shields and covers rubbing against the moving wheels. I ended up getting out with a utility knife, climbing under the car, and cutting out several shields, wheel well liners, and other plastic bits that were lodged into the wheels. Surprisingly, the car was drivable and I was able to drive it to the body shop.

Right after the accident, I made the mistake of putting the car in park and getting out to check the situation before I hit the dashcam save button. The drive to the body shop was over an hour long, so the footage was overwritten. Luckily, I was able to use some forensic file-recovery software to recover the footage off the external drive I had plugged in.
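For anyone in the same spot: the recovery works because "deleted" dashcam clips usually remain on the drive until that space is reused. A minimal, hypothetical file-carving sketch (scanning a raw image for the MP4 `ftyp` signature) illustrates the idea; real recovery tools such as PhotoRec are far more robust than this:

```python
# Hypothetical file-carving sketch: scan a raw disk image for MP4 signatures.
# In the ISO base media file format, a 4-byte box-size field precedes the
# 4-byte 'ftyp' type code, so a candidate file starts 4 bytes before 'ftyp'.
def find_mp4_offsets(image_path: str, chunk_size: int = 1 << 20) -> list[int]:
    sig = b"ftyp"
    offsets = []
    base = 0          # absolute offset of data[0] within the image
    tail = b""
    with open(image_path, "rb") as f:
        while chunk := f.read(chunk_size):
            data = tail + chunk
            start = 0
            while (i := data.find(sig, start)) != -1:
                offsets.append(base + i - 4)  # back up over the box-size field
                start = i + 1
            # Carry len(sig)-1 bytes so a signature split across chunks is caught.
            keep = len(sig) - 1
            tail = data[-keep:]
            base += len(data) - keep
    return offsets
```

Calling `find_mp4_offsets("usb.img")` on a raw dump of the drive (the filename is a placeholder) would list candidate byte offsets where clips may begin; actual recovery then means copying data out from those offsets.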

In the footage, you can see the vehicle leave the lane, and within about 10 frames, I had already begun pulling back into the lane before losing control and skidding off the road. Since TeslaCam records at about 36 frames per second, this would mean I reacted within roughly 280 ms of the lane departure (10 frames ÷ 36 fps ≈ 0.28 s). I understand it is my responsibility to pay attention and maintain control of the vehicle, which I agreed to when I enrolled in FSD beta. I was paying attention, but human reaction does not get much faster than this, and I am not sure how I could have otherwise avoided this incident. The speed limit on this road is 55 mph. I would estimate FSD was going about 45-50 mph, but I have no way to confirm. I think the corrective steering I applied was too sharp given the speed the vehicle was going, and I lost grip with the pavement. On the version of the clip slowed to 40% speed, you can sort of see the back end of the car break loose in the way the front end starts to wiggle as the mailbox makes its way to the left side of the frame.
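As a quick sanity check on the frame arithmetic (the ~36 fps TeslaCam rate is the figure quoted in the post, not an official spec):

```python
# Convert a frame count at a given frame rate into a reaction time in ms.
def frames_to_ms(frames: int, fps: float = 36.0) -> float:
    return frames / fps * 1000.0

print(f"{frames_to_ms(10):.0f} ms")  # 10 frames at 36 fps ~ 278 ms
```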

Surprisingly, I somehow managed to steer this flying car through a mini-forest, avoiding several trees (although I did knock off the driver's side mirror). There is no side panel damage whatsoever. The bumper cover is ruined and the car sustained fairly severe structural/suspension damage, both front and rear suspension components.

Luckily, nobody was hurt (except my poor car). I could not imagine the weight on my conscience if I had been too slow to intervene and ended up striking that oncoming vehicle. Front overlap collisions are some of the most deadly ways to crash a car, and bodily injury would have been very likely.

I have a perfect driving record and have never had an at-fault accident in the over 10 years I have been licensed. The thought of filing an insurance claim and increasing my premiums over this incident makes me sick. I am considering legal action against Tesla, but I'm not going to get into that here. Just wanted to make everyone aware and hyper-vigilant about FSD. I thought I was, but then this happened. I am going to be much more careful about the situations in which I decide to engage it. There is too much at stake, it is not mature enough, and frankly, Tesla's apathy and lack of communication around this incident really concerns me, as both an owner and a road-user.


tl;dr: Be careful with FSD, folks. And if you get into an accident, hit the dashcam save button or honk your horn before you put it in park.



“Display of a Tesla car on autopilot mode showing current speed, remaining estimated range, speed limit and presence of vehicles on motorway lanes” by Marco Verch is licensed under CC BY 2.0.
 
It’s available to buy (€10,000) but I am not aware of any testing here (but I could be wrong). By the way, I’m only on 10.2
Just for reference Tesla makes it DAMN confusing with the dumb overlapping nomenclature.

FSD is the Full Self Driving Capability option package.
FSD Beta is the Full Self Driving testing program and its software.

10.2 is the overall software user interface (UI) version number.
10.x (in this forum) is the Beta version number. All Beta testers are also on UI 10.2; on their screens, "10.2" is simply replaced with the word "Beta", which is unrelated to the Beta version number.

EDIT: VanillaAir beat me by a few seconds.

 

I’m sure that on my screen it calls it ‘Beta’
 
Even basic AP auto steer says (Beta) and every customer has that.

But this could just be an AP issue, although with Beta enabled that stack should be in control. Autosteer has a history of hugging the outside line in turns as well; nothing new.

They just need to improve bias by 6% for inside side of lane and problem solved!
 
So Beta is available in Europe? OP's info says Missouri.

Very strange.

Edit: This is a lie. You can tell by the screen that it isn't Beta.

It's AP. He doesn't even have the beta visualizations or screen layout.
 
Won't EVER happen. Just like the windshield wipers. How could doing so benefit Tesla? There's clear benefit to calling it beta forever, or at least for many, many, many years.
Depends on what Beta you are talking about. If talking about the (Beta) label, I agree it'll probably be always there.

If talking about the public beta program, it may leave that eventually and simply become city streets (without having to jump through all the hoops you have to right now for FSD Beta access).
 
Consider that the captain of a jumbo jet carrying 400+ people, using autopilot, can make a full autoland on a runway in near-zero visibility, yet is always responsible and required to be able to take over manually at any time and safely fly the airplane regardless of conditions. The airplane's autopilot is only required to operate vertically and horizontally, and is capable of only a fraction of what Tesla's FSD Beta can do. Aviation "full self flying" has more or less been abandoned over the last 25 years, due to the obviously catastrophic outcome and liability in the event of even the slightest automation failure. The bottom line is, I think we will never be in an automated driving condition, or FSD, where the driver is not ultimately and always fully liable and required to be in command. Personally, I do not foresee a safe full self-driving autopilot for the masses in the near future. It's just an opinion.
 
This is somewhat apples to oranges. First, with 400 people on board, an airplane crash is FAR more serious than a car crash. Sure, cumulatively over many crashes, more people die on the roads than in airplanes, but the point is it takes only ONE error to (potentially) kill all 400 passengers and crew. Also, airplane crashes, by their nature, tend to be much more catastrophic on average, whereas most car crashes are of the fender-bender variety... even in freeway crashes, fatalities (though tragic) are rare per crash; it is only because there are so many crashes that the occasional fatality adds up.

The net result is that, like it or not, the bar is lower for cars. Is it low enough that FSD can jump over it? I don't know, and I don't think anyone else does yet, but if someone doesn't try, we will NEVER know. A lot of people are saying it can't be done, but apart from some vague "it's much harder than you think" stuff, I've never seen any convincing technical arguments as to why not. My feeling is the problem is hard, and probably underestimated by many (maybe including me), but not impossible or infeasible.
 
The problem is probably more philosophical than physical... FSD is going to be safer than human driving, perhaps by an enormous margin, but the accident that FSD does have isn't going to be the same accident that the human would have had if he were driving. So it is solely an FSD accident, and that is unacceptable to society. The safety record isn't a license to have an occasional accident.
 
This is really odd. I don't have FSD and will probably never purchase it, but my basic Autopilot here in France/Europe is just terrible. The key issue for me is the very dangerous phantom braking that appeared in my Model X 100D about 2 years ago, after an OTA update of the car's software. Every OTA since September 2017, when I bought my car, has downgraded its capabilities instead of improving them. And their European maps are just terrible, with wrong speed limits all over the place, 2 years after their last maps update here. After the 4.5 years I've had mine, this has become embarrassing. When you see TSLA shares valued as if the robotaxis were already here, or the FSD price propelled above €10k... this is just scary nonsense that will end badly. At Tesla Superchargers here, the long-time Model S & X owners are losing faith. Me included, to be honest. Please wake up, back to earth, Elon!
 
Having watched the clip, I'm left with one troubling thought. Without waiting to see if AP/FSD would resolve the situation at the last second (which would be far more dangerous), at what point can you argue that a driver-assist feature, as opposed to the driver, put another driver or its occupant in "danger"?

The warnings about remaining in control are clear and frequent, but that only means something until the day a legal precedent is set that they don’t.
 
Agreed, and people tend to react emotionally to a single accident rather than look at the overall statistics. However, this is mostly a familiarity thing: after all, we all tolerate the relatively high accident risks of normal human driving for the convenience of personal car travel.
 
In its deployment as a Level 2 ADAS, for sure.

I don't think there could be a robotaxi scenario that doesn't involve Tesla taking liability for the driving task and whatever results.
If we get to Level 5 robotaxis, then they will be covered by insurance like any car... except that because the accident rate is low, the premiums will be very cheap.
 
Couldn’t agree with this more.

I think we can all agree that this stuff is still in its infancy (relatively speaking considering how long we’ve had access to personal vehicles).

However, one thing I do like about AP/FSD is that so far I haven’t seen any Teslas on AP road raging of their own volition or using the car as a weapon, which is a behaviour people display with alarming regularity. Some things are worth working at.