FSD Beta Attempts to Kill Me; Causes Accident

Long time lurker, first time poster. I have been trying to work with Tesla to resolve this issue out of the public domain, but they have been characteristically terrible and honestly don't seem to care. Over the last 3 weeks, I have sent multiple emails, followed up via phone calls, escalated through my local service center, and nobody from Tesla corporate has even emailed or called to say they are looking into this. One of my local service center technicians opened a case with engineering, which she said would take 90 days to review. I find that absurd, especially when Tesla is releasing new versions every 2 weeks. I think it's important for people to be extra cautious about which roads they engage FSD beta on, especially since Tesla seems to be ignoring my report entirely.


This incident happened almost 3 weeks ago on Monday, November 22nd at around 6:15 in the evening, shortly after the sun had set. I was driving my Tesla Model Y on a two-lane rural road and had FSD engaged. The car was still on version 10.4 at the time. It was a clear night, no rain or adverse weather conditions. Everything was going fine, and I had used FSD beta on this stretch of road before without a problem. There was some occasional phantom braking, but that had been sort of common with 10.4.

A banked right-hand curve in this two-lane road came up, with a vehicle coming around the curve in the opposite direction. The Model Y slowed slightly and began making the turn properly and without cause for concern. Suddenly, about 40% of the way through the turn, the Model Y straightened the wheel and crossed over the center line into the direct path of the oncoming vehicle. I reacted as quickly as I could, trying to pull the vehicle back into the lane. I really did not have a lot of time to react, so I chose to override FSD by turning the steering wheel since my hands were already on the wheel and I felt this would be the fastest way to avoid a front overlap collision with the oncoming vehicle. When I attempted to pull the vehicle back into my lane, I lost control and skidded off into a ditch and through the woods.

I was pretty shaken up and the car was in pieces. I called for a tow, but I live in a pretty rural area and could not find a tow truck driver who would touch a Tesla. I tried moving the car and heard underbody shields and covers rubbing against the moving wheels. I ended up getting out with a utility knife, climbing under the car, and cutting out several shields, wheel well liners, and other plastic bits that were lodged into the wheels. Surprisingly, the car was drivable and I was able to drive it to the body shop.

Right after the accident, I made the mistake of putting it in park and getting out of the vehicle first to check the situation before I hit the dashcam save button. The drive to the body shop was over an hour long, so the footage was overwritten. Luckily, I was able to use some forensic file recovery software to recover the footage off the external hard drive I had plugged in.
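
For anyone curious, tools like that generally work by file carving: scanning the raw drive for the signatures of deleted video files. A rough sketch of the idea (not the actual tool I used; "teslacam.img" is just a stand-in path for a raw image of the drive, and real tools like PhotoRec handle fragmentation and file sizes properly):

```python
# Rough illustration of signature-based carving for deleted MP4 clips.
# "teslacam.img" is a made-up path for a raw image of the USB drive.
SIG = b"ftyp"  # the 'ftyp' box normally sits 4 bytes into an MP4 file

def find_mp4_offsets(image_path, chunk_size=1 << 20):
    """Return candidate byte offsets where MP4 files may start."""
    offsets, base, window = [], 0, b""
    with open(image_path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            window += chunk
            i = window.find(SIG)
            while i != -1:
                offsets.append(base + i - 4)  # the box length field precedes 'ftyp'
                i = window.find(SIG, i + 1)
            # keep a small tail so a signature split across chunks isn't missed
            keep = window[-(len(SIG) - 1):]
            base += len(window) - len(keep)
            window = keep
    return offsets

print(find_mp4_offsets("teslacam.img")[:10])
```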

In the footage, you can see the vehicle leave the lane, and within about 10 frames, I had already begun pulling back into the lane before losing control and skidding off the road. Since Teslacam records at about 36 frames per second, this would mean I reacted within about 280 ms of the lane departure. I understand it is my responsibility to pay attention and maintain control of the vehicle, which I agreed to when I enrolled in FSD beta. I was paying attention, but human reaction does not get much faster than this and I am not sure how I could have otherwise avoided this incident. The speed limit on this road is 55mph. I would estimate FSD was probably going about 45-50mph, but have no way to confirm. I think the corrective steering I applied was too sharp given the speed the vehicle was going, and I lost grip with the pavement. On the version of the clip slowed down to 40% speed, you can sort of see the back end of the car break loose in the way the front end starts to wiggle as the mailbox makes its way to the left side of the frame.
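
For those checking the math, here's the back-of-the-envelope calculation (the ~36 fps figure and the 10-frame gap are my estimates from above):

```python
# Back-of-the-envelope reaction-time estimate from the dashcam footage.
FRAME_RATE_FPS = 36      # approximate TeslaCam recording rate
FRAMES_TO_REACT = 10     # frames between lane departure and my steering input

reaction_time_ms = FRAMES_TO_REACT / FRAME_RATE_FPS * 1000
print(f"~{reaction_time_ms:.0f} ms")  # ~278 ms
```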

Surprisingly, I somehow managed to steer this flying car through a mini-forest, avoiding several trees (although I did knock off the driver's side mirror). There is no side panel damage whatsoever. The bumper cover is ruined and the car sustained fairly severe structural/suspension damage, both front and rear suspension components.

Luckily, nobody was hurt (except my poor car). I could not imagine the weight on my conscience if I had been too slow to intervene and ended up striking that oncoming vehicle. Front overlap collisions are some of the most deadly ways to crash a car, and bodily injury would have been very likely.

I have a perfect driving record and have never had an at-fault accident in the over 10 years I have been licensed. The thought of filing an insurance claim and increasing my premiums over this incident makes me sick. I am considering legal action against Tesla, but I'm not going to get into that here. Just wanted to make everyone aware and hyper-vigilant about FSD. I thought I was, but then this happened. I am going to be much more careful about the situations in which I decide to engage it. There is too much at stake, it is not mature enough, and frankly, Tesla's apathy and lack of communication around this incident really concerns me, as both an owner and a road-user.


tl;dr: Be careful with FSD, folks. And if you get into an accident, hit the dashcam save button or honk your horn before you put it in park.



“Display of a Tesla car on autopilot mode showing current speed, remaining estimated range, speed limit and presence of vehicles on motorway lanes” by Marco Verch is licensed under CC BY 2.0.
 
...
However, I have noticed that the Tesla detection of anything going at 90 degrees to you is also a little suspect. But I am only going by how poorly it gets represented on the screen. Whether in real life my Tesla is fully alert I wouldn’t know (and hope I never have to find out)
Regarding visualization on the screen vs what the car actually sees: I have paid FSD but not FSD beta at this time. In AP Autosteer, the car is very cautious when approaching a VRU, i.e. a pedestrian, bicyclist, wheelchair, etc. on the shoulder of the road. Unlike FSD (City Streets) beta, it will not move over to give them more room, but it slows down quite a bit (and far ahead of the VRU) and is reluctant to pass if there isn't plenty of space. The slowing down happens well before the person or bicycle icon appears in the display. So I don't think a non-visualized object is an unseen object for the car. As an added comment, I see some delay in the rendering of objects as the car passes them. Perhaps this delay or latency is less in the FSD beta builds compared to standard Autosteer.
 
There is some latency in FSD beta also (though, subjectively, it appears less to me). However, it seems from various actions the car has taken while I've been testing it that this is more of a screen display artifact than a delay in the car's internal perception.
 
Thank you, that is reassuring
 
I'm not sure what point you're trying to make with that. Obviously this person wasn't doing the job she was paid to do, and it reflects poorly on her as well as her employer.

Due to her negligence she was charged with a felony.

I would say Uber was negligent too as they didn't have driver monitoring in place. Even Tesla has driver monitoring that would have prevented her from watching a video on her phone.

This failure doesn't take away from the fact that being a safety driver takes not only work, but also additional responsibility.
I think of the Uber incident every time I see someone say there should be paid beta testers and Tesla is being careless. I'd say someone like Chuck Cook is going to be a lot safer beta testing than someone being paid minimum wage who has no real interest in seeing autonomous cars succeed. Of course, there's always going to be idiots who misuse the FSD beta but the new ejection policy should help curtail that.

These days, I'm more worried about some anti-Tesla nutjob staging an accident. This incident comes to mind:
 
I can say that if you were relying on Volvo’s own detection system to save your life then you are in big trouble. The Volvo system is fragile and next to useless.
But the big spinning Uber thing on the roof should have picked up the pedestrian as it seems to spin 360 degrees like a ship’s radar.
However, I have noticed that the Tesla detection of anything going at 90 degrees to you is also a little suspect. But I am only going by how poorly it gets represented on the screen. Whether in real life my Tesla is fully alert I wouldn’t know (and hope I never have to find out)
Volvo claims that they ran simulations of the incident, using data from Uber's sensors, and their AEB system would have alerted the driver and probably avoided the collision on its own (17 out of 20 times, with the other 3 reduced to an impact speed below 10 mph).
Here's the full NTSB report: https://www.ntsb.gov/investigations/AccidentReports/Reports/HAR1903.pdf
 
Here’s the summary of the 78-page report:

3.2 Probable Cause
The National Transportation Safety Board determines that the probable cause of the crash in Tempe, Arizona, was the failure of the vehicle operator to monitor the driving environment and the operation of the automated driving system because she was visually distracted throughout the trip by her personal cell phone. Contributing to the crash were the Uber Advanced Technologies Group’s (1) inadequate safety risk assessment procedures, (2) ineffective oversight of vehicle operators, and (3) lack of adequate mechanisms for addressing operators’ automation complacency—all a consequence of its inadequate safety culture. Further factors contributing to the crash were (1) the impaired pedestrian’s crossing of N. Mill Avenue outside a crosswalk, and (2) the Arizona Department of Transportation’s insufficient oversight of automated vehicle testing.


The pedestrian was on drugs and the driver on the phone.....
 
I own a lot of Tesla stock. As long as they are conducting this glorious experiment on public roads, I would like them to do everything in their power to make their automation safe, as an accident that their automation contributes to will not be good for business or their market cap. They should:

1) Study blended steering and see if it is safer for avoiding overcorrection than the current implementation. They have data on this already.
2) Find a way to ensure drivers keep at least one hand on the wheel in a high leverage position (i.e. not the bottom where you have a short lever arm) at all times. Use new sensors if you must. Retrofit if you must.
3) Kick people out for not paying attention even in the slightest way. Make this much more strict. It’s actually quite good now, but now that they have the capability, make it better. Tell the owner what they are doing, and tell them to stop. But don’t give too much leeway. Retrofit vehicles that do not have cabin cameras.
4) Stop steering into oncoming traffic. This is probably the hardest task as it likely involves perception, but I believe they can do better.
5) Address any other known safety issues highlighted by their access to the crash data on their vehicles.
6) Find a way to prevent road departure accidents like this through the use of active driver assistance. Why did the car allow itself to be steered off the road? Why does the capability to avoid such accidents not exist? It’s a very difficult problem, because you don’t want to override driver authority, but I am not convinced it is more difficult than FSD. Can they improve on their emergency lane departure feature? What if someone has a seizure or heart attack and tries to drive the car off the road?

There are always trade-offs. Chances are the resources spent on some of the above (1-3) will help progress in the short term but impair it in the long term.

The rest (3-6) are probably already being done by Tesla to the best of their abilities, and your sense that they are not doing so is just that: an impression.

Maybe Tesla could do something better, or maybe they already do what they believe is best for the overall outcome...
 
This is Europe. The US is probably a different planet.
Yea, in the US there's still no mandate for side "barriers" on semi trailers. That one much-publicized underride death in the Tesla might have been prevented had the "barrier" been installed on the trailer. Yet all the noise was about Tesla and none about the lack of the "barrier", even though installing the latter is much easier and cheaper...
 
I think of the Uber incident every time I see someone say there should be paid beta testers and Tesla is being careless. I'd say someone like Chuck Cook is going to be a lot safer beta testing than someone being paid minimum wage who has no real interest in seeing autonomous cars succeed. Of course, there's always going to be idiots who misuse the FSD beta but the new ejection policy should help curtail that.

These days, I'm more worried about some anti-Tesla nutjob staging an accident. This incident comes to mind:
You're using exaggerated examples.

Chuck Cook is not a good example of what an average FSD Beta owner is. Can you seriously look at what he's done, and say that doesn't deserve compensation? Not only is he not being paid, but the FSD Beta team doesn't care about him. The very ejection policy you're referring to very well might get him ejected from the program. The last time I saw an update he was on strike 2 out of 3. The ejection policy is based on driver monitoring that is in an early stage of beta, and Tesla isn't giving the customer the means to properly test it. We don't know how prone it is to false positives or false negatives.

Plus safety drivers are not being paid minimum wage. Now, sure, their pay isn't high, but we're talking at least $20/hr. Not only do they get paid, but if a crash occurs it's not their car. The only way they can have liability for an accident is if they do something extremely idiotic, like watching a video on a phone, which completely defeats their entire purpose.

The lack of pay, and having to take all the responsibility isn't even the biggest problem. The biggest problem with the FSD Beta program is the lack of communication between the FSD Beta team, and the customer. This lack of interaction means that the best FSD Beta testers are not being leveraged to full advantage. So not only is there no monetary gain to be had, but there isn't even the satisfaction of getting a bug squashed.

It means that ho-hum FSD Beta owners like myself have given up even bothering with it. Why report issues if they just go into the ether?

The only purpose to what Chuck Cook and others put themselves through is our own entertainment: to see how it progresses from one version to the next.

The lack of any purpose except the intellectual curiosity of owners hardly justifies the risk. The risk is not just the idiots doing dumb things, but people like the OP who simply failed for a moment.

I'm not worried about staged incidents. I think autonomous vehicle testing/development always brings out the crazies, and that's to be expected. What I'm worried about is driver inattention as FSD Beta gets better. As owners we're looking for functionality, and where it works well. Once we've established use cases, the trust starts to build. Once trust is built, it's really hard for a human to maintain the same level of vigilance. That's when the really bad accidents will happen.
 
Can I ask: why is it that before I bought FSD all I saw were videos about how great it is, and now all I hear is how stressful and dangerous it is...
I'm confused.

It says you're in France, but I thought FSD wasn't available in Europe.

Anyways, when it comes to the FSD Beta there is a group of "influencers" whose videos make FSD Beta look like it's really great.

Then there are other more reliable FSD Beta testers like Chuck Cook who do a reasonably good job showing the good and bad.

Even before the floodgates were opened, Chuck Cook cautioned us about how stressful and dangerous it was. And once it was released, this aspect was readily apparent to most anyone using it.
 
It’s available to buy (€10,000) but I am not aware of any testing here (but I could be wrong). By the way, I’m only on 10.2
 
By the way, I’m only on 10.2

The 10.2 you are on has nothing to do with the FSD City Streets beta. It is the standard car software version, and 10.2 has been that version for about the last 2 years. v10 came around Sep 2019 with 2019.32.11?, and 10.1 and 10.2 quickly followed, which is where it has for whatever reason remained. Most people use the year, week, build designation. FSD beta has started to refer to versions again, but the FSD beta 10.6.1, 10.8, etc. is not the same versioning as your 10.2.
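
To make the distinction concrete, here's a rough sketch of how the standard firmware designation breaks down (the helper below is purely illustrative, not anything from Tesla's software):

```python
# Illustrative breakdown of the standard firmware designation, e.g. "2019.32.11":
# year, then week of the year, then build number(s). The FSD beta's own "10.x"
# numbers follow a separate scheme and can't be compared against this.
def parse_firmware_version(version: str) -> dict:
    year, week, *build = version.split(".")
    return {"year": int(year), "week": int(week), "build": ".".join(build)}

print(parse_firmware_version("2019.32.11"))
# -> {'year': 2019, 'week': 32, 'build': '11'}
```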
 