
FSD Beta Attempts to Kill Me; Causes Accident

Long time lurker, first time poster. I have been trying to work with Tesla to resolve this issue out of the public domain, but they have been characteristically terrible and honestly don't seem to care. Over the last 3 weeks, I have sent multiple emails, followed up via phone calls, escalated through my local service center, and nobody from Tesla corporate has even emailed or called to say they are looking into this. One of my local service center technicians opened a case with engineering, which she said would take 90 days to review. I find that absurd, especially when Tesla is releasing new versions every 2 weeks. I think it's important for people to be extra cautious about which roads they engage FSD beta on, especially since Tesla seems to be ignoring my report entirely.




This incident happened almost 3 weeks ago, on Monday, November 22nd, at around 6:15 in the evening, shortly after the sun had set. I was driving my Tesla Model Y on a two-lane rural road with FSD Beta engaged. The car was still on version 10.4 at the time. It was a clear night, no rain or adverse weather conditions. Everything was going fine, and I had used FSD Beta on this stretch of road before without a problem. There was some occasional phantom braking, but that had been fairly common with 10.4.

A banked right-hand curve came up in this two-lane road, with a vehicle coming around the curve in the opposite direction. The Model Y slowed slightly and began making the turn properly and without cause for concern. Suddenly, about 40% of the way through the turn, the Model Y straightened the wheel and crossed over the center line into the direct path of the oncoming vehicle. I reacted as quickly as I could, trying to pull the vehicle back into the lane. I really did not have a lot of time to react, so I chose to override FSD by turning the steering wheel, since my hands were already on the wheel and I felt this would be the fastest way to avoid a front-overlap collision with the oncoming vehicle. When I attempted to pull the vehicle back into my lane, I lost control and skidded off into a ditch and through the woods.

I was pretty shaken up and the car was in pieces. I called for a tow, but I live in a pretty rural area and could not find a tow truck driver who would touch a Tesla. I tried moving the car and heard underbody shields and covers rubbing against the moving wheels. I ended up getting out with a utility knife, climbing under the car, and cutting out several shields, wheel well liners, and other plastic bits that were lodged into the wheels. Surprisingly, the car was drivable and I was able to drive it to the body shop.

Right after the accident, I made the mistake of putting the car in park and getting out to check the situation before hitting the dashcam save button. The drive to the body shop was over an hour long, so the footage was overwritten. Luckily, I was able to use forensic file recovery software to recover the footage from the external drive I had plugged in.
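(For the curious: the recovery works because deleting a clip only marks its space as free; until that space is actually reused, the video bytes are still sitting on the drive and can be carved back out by their headers. Here's a minimal sketch of the idea in Python, with hypothetical file names; real tools like PhotoRec handle fragmentation and do this far more robustly.)

```python
# Naive MP4 "file carving" sketch: find clips on a raw image of the
# dashcam drive by locating their headers, then dump the bytes between
# consecutive headers. Hypothetical paths; illustration only.
SIGNATURE = b"ftyp"                 # marker found 4 bytes into every MP4 file
MAX_CLIP_BYTES = 80 * 1024 * 1024   # cap each carved file at a sane size

with open("usb_drive.img", "rb") as f:  # raw image of the dashcam drive
    data = f.read()

# Collect the start offset of every candidate clip.
starts = []
pos = data.find(SIGNATURE)
while pos != -1:
    starts.append(max(pos - 4, 0))       # the box-size field precedes "ftyp"
    pos = data.find(SIGNATURE, pos + 1)

# Carve each region out to its own file.
for i, start in enumerate(starts):
    end = starts[i + 1] if i + 1 < len(starts) else len(data)
    end = min(end, start + MAX_CLIP_BYTES)
    with open(f"recovered_{i:03d}.mp4", "wb") as out:
        out.write(data[start:end])
```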

In the footage, you can see the vehicle leave the lane, and within about 10 frames I had already begun pulling back into the lane before losing control and skidding off the road. Since TeslaCam records at about 36 frames per second, this means I reacted within about 280ms of the lane departure. I understand it is my responsibility to pay attention and maintain control of the vehicle, which I agreed to when I enrolled in FSD Beta. I was paying attention, but human reaction does not get much faster than this, and I am not sure how I could have otherwise avoided this incident. The speed limit on this road is 55mph. I would estimate FSD was going about 45-50mph, but I have no way to confirm. I think the corrective steering I applied was too sharp for the speed the vehicle was going, and I lost grip with the pavement. On the version of the clip slowed to 40% speed, you can sort of see the back end of the car break loose in the way the front end starts to wiggle as the mailbox makes its way to the left side of the frame.
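(The reaction-time math is simple enough to check yourself; a quick sketch, taking the ~36fps recording rate above at face value:)

```python
# Reaction time implied by the dashcam footage, assuming TeslaCam
# records at roughly 36 frames per second.
FPS = 36
frames_to_react = 10          # frames between lane departure and my correction

reaction_ms = frames_to_react * 1000 / FPS
print(f"reaction time: {reaction_ms:.0f} ms")   # ~278 ms
```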

Surprisingly, I somehow managed to steer this flying car through a mini-forest, avoiding several trees (although I did knock off the driver's side mirror). There is no side-panel damage whatsoever. The bumper cover is ruined, and the car sustained fairly severe structural damage, including both front and rear suspension components.

Luckily, nobody was hurt (except my poor car). I could not imagine the weight on my conscience if I had been too slow to intervene and ended up striking that oncoming vehicle. Front-overlap collisions are among the deadliest ways to crash a car, and bodily injury would have been very likely.

I have a perfect driving record and have never had an at-fault accident in the over 10 years I have been licensed. The thought of filing an insurance claim and increasing my premiums over this incident makes me sick. I am considering legal action against Tesla, but I'm not going to get into that here. Just wanted to make everyone aware and hyper-vigilant about FSD. I thought I was, but then this happened. I am going to be much more careful about the situations in which I decide to engage it. There is too much at stake, it is not mature enough, and frankly, Tesla's apathy and lack of communication around this incident really concerns me, as both an owner and a road-user.


tl;dr: Be careful with FSD, folks. And if you get into an accident, hit the dashcam save button or honk your horn before you put it in park.



“Display of a Tesla car on autopilot mode showing current speed, remaining estimated range, speed limit and presence of vehicles on motorway lanes” by Marco Verch is licensed under CC BY 2.0.
 
While I feel bad about the accident, I can't accept that FSD Beta is the primary culprit.

The primary culprit is that the driver failed to oversee FSD Beta, and that failure directly led to the accident.

So the next question to ask ourselves is whether the driver was truly capable of doing that job.

When I test FSD Beta I use both hands, and I'm more "on it" than I would ever be while driving manually. Why is that? It's because FSD Beta freaked out on me on a perfectly straight road in perfect conditions. That single moment taught me just how capable it is of killing me if I don't instantly tell it no.

The OP could easily have been me, had I stayed as relaxed as I was when I first tested FSD Beta.

Tesla wasn't astute in selecting beta testers. The Safety Score wasn't a measure of whether you had driving skills or the ability to keep the car's computer from killing you. Instead, it was a silly little game that you could do well on if you understood its rules. It's like we were monkeys that got a special treat for doing what was expected.

They say it's a beta, but a beta has things like bug reports and follow-ups. Has Tesla ever followed up with you on anything you reported? Have they ever indicated caring about what you report? People on TMC have discovered that map issues have to be fixed to have even a remote chance at something useful, so they're on a mission to figure out how to fix them, but contacting Tesla isn't even in the game plan.

I personally have mixed feelings about this whole FSD Beta thing.

On the one hand, it's exactly why I bought FSD: to get a chance to experience autonomous driving, even if it's highly supervised.

On the other hand, it's exhausting, and a cycle of excitement followed by disappointment has emerged. Like right now I'm excited for 10.7, despite the fact that it's only been a little while since 10.6.1, so it's probably not going to be that much better.

The bottom line is that it simply wasn't ready for 20K people to use. FSD Beta testing is a job that deserves a paycheck and a team that listens to the issues being reported. A job you have to do in solitude, as no one with any self-preservation would dare be in the car with you.
 
That video is horrific!
But there are still questions I'd like answered... for example, what was the speed limit on the road the cyclist walked across at night wearing black? Did the car have FSD, and if so, was it on?
Clearly the driver was not paying attention, but the guy crossing didn't look like he was doing himself any favors either.
On a dark, fast road, any car on any kind of cruise control would have hit the person crossing. That in-cab camera will ruin the driver's day in court, but the driver appeared to look up and to the left about the time the pedestrian would have started crossing from the left, and didn't see them, possibly because it was a dark road and the pedestrian was wearing dark clothes.
All in all a very sad situation. Does anyone know what happened to the driver?
 
That video is horrific! ... Does anyone know what happened to the driver?

It's kinda off-topic, but you can look up the Uber autonomous-driving death on Google.
 
That video is horrific! ... Does anyone know what happened to the driver?
That's the infamous Uber Volvo that killed a jaywalker at night. The fleet was taken off the road. My opinion is that the accident was entirely the victim's fault: a human driver would very likely not have seen the jaywalker until the last second and would have done nothing different from the automated driving system. I think that accident was a big setback for car automation programs, sadly for the wrong reason.
 
That's the infamous Uber Volvo that killed a jaywalker at night. ...
You shouldn't trust Uber's deceptive video.
In simulating conditions of the crash and consulting their textbooks, detectives found that Herzberg could have been seen 143 feet down the road by 85 percent of motorists. The Volvo was traveling at about 40 mph in the lane nearest the curb and never braked. If Vasquez had braked, she would have given Herzberg an extra .57 seconds of time to cross in front of the vehicle. Police further determined that in that half-second, Herzberg would have walked an extra 2.64 feet. She would have only needed 2.1 feet to make it past the Volvo.

"For this reason, the crash was deemed avoidable," the report states.
Police report is at the bottom.
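For what it's worth, the report's figures hang together. A quick sanity check (my own arithmetic, in Python, using only the numbers quoted above):

```python
# Back-of-the-envelope check of the police report's numbers.
FT_PER_MILE = 5280
SEC_PER_HOUR = 3600

car_speed_fps = 40 * FT_PER_MILE / SEC_PER_HOUR   # 40 mph -> ~58.7 ft/s
sight_distance_ft = 143                           # where 85% of motorists could see her
time_visible_s = sight_distance_ft / car_speed_fps  # ~2.4 s to notice and brake

extra_time_s = 0.57      # time braking would have bought her
extra_walk_ft = 2.64     # distance she'd have covered in that time
needed_ft = 2.1          # distance she needed to clear the Volvo

walking_speed_fps = extra_walk_ft / extra_time_s  # ~4.6 ft/s, a brisk walk
print(f"time visible: {time_visible_s:.2f} s")
print(f"implied walking speed: {walking_speed_fps:.1f} ft/s")
print(f"margin to spare: {extra_walk_ft - needed_ft:.2f} ft")
```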
 
This is all news to me... it's a Volvo? So nothing to do with Tesla. Granted, it gives all self-driving a bad name... but it says 85% would have seen her? So 15% wouldn't? It's not hard to be in the 15%.
Also the speed, and missing her by a fraction of a second... it says 40mph, but was that the speed limit or the car's speed? I mean, for most cars, going 40 could mean anything from 35-45... that half second could be lost in many ways.
Of course, the clincher is the in-car camera... you can't fight that; no jury is going to see past it, and she was driving without paying full attention. She's guilty.
 
This is all news to me... it's a Volvo? So nothing to do with Tesla. ...
It's a self-driving car with data logs; they know exactly what happened.
I'm a little scared that 15% of drivers have eyesight so poor that they can't stop for a person in the road on a well-lit suburban street while going 40mph...
 
This is all news to me... it's a Volvo? So nothing to do with Tesla. ...
Yes, this was the car.
 
It's a self-driving car with data logs; they know exactly what happened. ...
I would most likely not have stopped for a person who jaywalks and enters traffic unexpectedly and unsafely. It is NOT reasonable to stop for every pedestrian who could potentially jump into the road in front of your car. When I see a person on the side of a five-lane road, I do NOT expect that person to walk into the path of my car. The driver may have been inattentive, but what about the pedestrian?
 
I would most likely not have stopped for a person who jaywalks and enters traffic unexpectedly and unsafely. ...
That's not what happened.

NTSB report:
 
If that's the scene of the accident, I would blame the town planner. There is an X-shaped footpath that goes nowhere, despite meeting the highway in four places. There are no crossings marked on the roads. That X is literally leading you up the garden path.
Yeah, that's horrible planning. What the heck was the point of that pathway? It's just encouraging jaywalking into traffic. According to the report, there are signs warning pedestrians to use the crosswalk, but at night people may not notice the signs (or heed them even if they do), as the design is an open invitation to walk through.
 
The primary culprit is that the driver failed to oversee FSD Beta, and that failure directly led to the accident. ... On the other hand, it's exhausting, and a cycle of excitement followed by disappointment has emerged. ...
Your story reminds me of the feelings I had with AP2, winter 2017 and on to summer 2018, waiting for new updates to achieve "AP1 parity".
 

I'm not sure what point you're trying to make with that. Obviously this person wasn't doing the job she was paid to do, and it reflects poorly on her as well as on her employer.

Due to her negligence she was charged with a felony.

I would say Uber was negligent too, as they didn't have driver monitoring in place. Even Tesla has driver monitoring that would have prevented her from watching a video on her phone.

This failure doesn't take away from the fact that being a safety driver not only takes work, but additional responsibility.
 
If that's the scene of the accident, I would blame the town planner. ...
The US treats people who bike or walk like second-class citizens, so it's not really much better elsewhere in the country.

I walk a lot, and I've seen some of the truly dumbest things. At my work a sidewalk suddenly ends, so you have to decide whether to walk in the street or cross four lanes of traffic plus a median to get to the sidewalk on the other side.

In the US you have to be really careful with pedestrians, as our infrastructure forces them to do some stupid things unless they're willing to walk way out of their way.
 
The first is you can't do subtle steering corrections. Most of the corrections I want to make are mild, where it's drifting too far one way or the other, but there is no way to correct it without taking over completely, which is often more jarring than I want.
I have no problem taking over without any jerk. Actually, I put the steering setting on 'Sport'; otherwise it is way too easy to disengage without noticing it.

The second is that if you take over using steering, it doesn't cancel out of TACC. So there is no full takeover unless you steer and slightly brake at the same time.
This one I wholeheartedly agree with. I wish there were an option to have it disengage completely in this case. I've posted a couple of scenarios elsewhere where the current behavior is dangerous.
 
....it’s a Volvo?

It's extremely important to understand that the Volvo was being used by Uber as simply a chassis on which to install Uber's own autonomous system. All standard and optional safety features, including Volvo's pedestrian detection and avoidance, were removed or disabled by Uber so as not to interfere with the test platform.

So while it was indeed an automobile chassis originally produced by Volvo, it was not any Volvo you can buy. It was significantly crippled compared to those before being "enhanced" by Uber. Very unfortunate coverage for Volvo, as they were one of the leaders in introducing pedestrian safety features.
 
It's extremely important to understand that the Volvo was being used by Uber as simply a chassis on which to install Uber's own autonomous system. ...
I can say that if you were relying on Volvo's own detection system to save your life, then you are in big trouble. The Volvo system is fragile and next to useless.
But the big spinning Uber thing on the roof should have picked up the pedestrian, as it seems to spin 360 degrees like a ship's radar.
However, I have noticed that Tesla's detection of anything moving at 90 degrees to you is also a little suspect. But I am only going by how poorly it is represented on the screen. Whether my Tesla is fully alert in real life, I wouldn't know (and hope I never have to find out).