
FSD Beta Attempts to Kill Me; Causes Accident

Long time lurker, first time poster. I have been trying to work with Tesla to resolve this issue out of the public domain, but they have been characteristically terrible and honestly don't seem to care. Over the last 3 weeks, I have sent multiple emails, followed up via phone calls, escalated through my local service center, and nobody from Tesla corporate has even emailed or called to say they are looking into this. One of my local service center technicians opened a case with engineering, which she said would take 90 days to review. I find that absurd, especially when Tesla is releasing new versions every 2 weeks. I think it's important for people to be extra cautious about which roads they engage FSD beta on, especially since Tesla seems to be ignoring my report entirely.

This incident happened almost 3 weeks ago on Monday, November 22nd at around 6:15 in the evening, shortly after the sun had set. I was driving my Tesla Model Y on a two-lane rural road and had FSD engaged. The car was still on version 10.4 at the time. It was a clear night, with no rain or adverse weather conditions. Everything was going fine, and I had used FSD Beta on this stretch of road before without a problem. There was some occasional phantom braking, but that had been fairly common with 10.4.

A banked right-hand curve came up in this two-lane road, with a vehicle coming around the curve in the opposite direction. The Model Y slowed slightly and began making the turn properly and without cause for concern. Suddenly, about 40% of the way through the turn, the Model Y straightened the wheel and crossed over the center line into the direct path of the oncoming vehicle. I reacted as quickly as I could, trying to pull the vehicle back into the lane. I really did not have a lot of time to react, so I chose to override FSD by turning the steering wheel, since my hands were already on the wheel and I felt this would be the fastest way to avoid a front-overlap collision with the oncoming vehicle. When I attempted to pull the vehicle back into my lane, I lost control and skidded off into a ditch and through the woods.

I was pretty shaken up and the car was in pieces. I called for a tow, but I live in a pretty rural area and could not find a tow truck driver who would touch a Tesla. I tried moving the car and heard underbody shields and covers rubbing against the moving wheels. I ended up getting out with a utility knife, climbing under the car, and cutting out several shields, wheel well liners, and other plastic bits that were lodged into the wheels. Surprisingly, the car was drivable and I was able to drive it to the body shop.

Right after the accident, I made the mistake of putting the car in park and getting out to check the situation before I hit the dashcam save button. The drive to the body shop was over an hour long, so the footage was overwritten. Luckily, I was able to use some forensic file recovery software to recover the footage from the external hard drive I had plugged in.
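
For anyone curious how that kind of recovery works: "overwritten" clips often still have their data sitting in unallocated space, and recovery tools find them by scanning the raw device for file signatures ("carving"). Here's a minimal sketch of the idea, assuming a raw image of the dashcam drive at a hypothetical path; real tools such as PhotoRec are far more robust than this.
Code:
# Sketch of signature-based "file carving", the general technique forensic
# recovery tools use. Illustrative only; the image path is hypothetical.
# MP4 clips begin with a box whose type field is 'ftyp' at byte offset 4.

CHUNK = 64 * 1024 * 1024   # read the raw image in 64 MB blocks

def find_mp4_starts(image_path):
    """Yield byte offsets where an MP4 clip likely begins."""
    offset = 0          # bytes consumed before the current block
    tail = b''          # overlap so a signature split across blocks isn't missed
    with open(image_path, 'rb') as f:
        while True:
            block = f.read(CHUNK)
            if not block:
                break
            data = tail + block
            i = data.find(b'ftyp')
            while i != -1:
                if i >= 4:  # the 4-byte box-size field precedes the type
                    yield offset - len(tail) + i - 4
                i = data.find(b'ftyp', i + 1)
            tail = data[-7:]    # 7 bytes: size field + 3 bytes of a split 'ftyp'
            offset += len(block)

# Hypothetical usage against a raw image of the dashcam USB drive:
for start in find_mp4_starts('/tmp/teslacam_usb.img'):
    print(f"possible clip header at byte {start}")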

In the footage, you can see the vehicle leave the lane, and within about 10 frames I had already begun pulling back into the lane before losing control and skidding off the road. Since TeslaCam records at about 36 frames per second, this means I reacted within roughly 280ms of the lane departure. I understand it is my responsibility to pay attention and maintain control of the vehicle, which I agreed to when I enrolled in FSD Beta. I was paying attention, but human reaction does not get much faster than this, and I am not sure how I could otherwise have avoided this incident. The speed limit on this road is 55mph. I would estimate FSD was going about 45-50mph, but I have no way to confirm that. I think the corrective steering I applied was too sharp given the speed the vehicle was going, and I lost grip with the pavement. On the version of the clip slowed to 40% speed, you can sort of see the back end of the car break loose in the way the front end starts to wiggle as the mailbox moves toward the left side of the frame.
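
For anyone who wants to check the math, here's a quick sanity check. The fps, frame count, and speed are the post's own estimates; the 0.9 friction coefficient is a hypothetical dry-asphalt value, not a measured one.
Code:
# Sanity check of the numbers above; every input is an estimate from the post.
FPS = 36              # approximate TeslaCam recording rate, frames/second
FRAMES = 10           # frames between lane departure and corrective steering
print(f"reaction time ~ {FRAMES / FPS * 1000:.0f} ms")   # ~278 ms

# Rough grip check: tires let go when lateral acceleration exceeds ~mu * g.
MU = 0.9              # hypothetical friction coefficient for dry asphalt
G = 9.81              # m/s^2
v = 50 * 0.44704      # upper speed estimate, 50 mph converted to m/s
min_radius = v**2 / (MU * G)   # tightest arc the tires can hold at this speed
print(f"tightest no-skid turn radius at 50 mph ~ {min_radius:.0f} m")  # ~57 m

Under those assumptions, any steering correction sharper than roughly a 57 m arc at that speed would exceed available grip, which is consistent with the back end breaking loose.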

Surprisingly, I somehow managed to steer this flying car through a mini-forest, avoiding several trees (although I did knock off the driver's side mirror). There is no side panel damage whatsoever. The bumper cover is ruined and the car sustained fairly severe structural/suspension damage, both front and rear suspension components.

Luckily, nobody was hurt (except my poor car). I could not imagine the weight on my conscience if I had been too slow to intervene and ended up striking that oncoming vehicle. Front overlap collisions are some of the most deadly ways to crash a car, and bodily injury would have been very likely.

I have a perfect driving record and have never had an at-fault accident in the over 10 years I have been licensed. The thought of filing an insurance claim and increasing my premiums over this incident makes me sick. I am considering legal action against Tesla, but I'm not going to get into that here. Just wanted to make everyone aware and hyper-vigilant about FSD. I thought I was, but then this happened. I am going to be much more careful about the situations in which I decide to engage it. There is too much at stake, it is not mature enough, and frankly, Tesla's apathy and lack of communication around this incident really concerns me, as both an owner and a road-user.


tl;dr: Be careful with FSD, folks. And if you get into an accident, hit the dashcam save button or honk your horn before you put it in park.



“Display of a Tesla car on autopilot mode showing current speed, remaining estimated range, speed limit and presence of vehicles on motorway lanes” by Marco Verch is licensed under CC BY 2.0.
 
I don't think that is the issue.

The problem is, how to get consent from the school kids and their parents if THEY want to be part of your testing?
Just wait for Tesla Bot. Tesla Bot will provide you with unlimited opportunities to practice scaremongering. Imagine: in the night, the Tesla Bot turns on its red eyes, takes a knife, and sneaks into a child's bedroom, accidentally mistaking the children for beef patties. I can already imagine the Reuters and AP horror stories about Tesla Bots run amok.
 
The problem is, how to get consent from the school kids and their parents if THEY want to be part of your testing?

Do we need to give consent when other human student drivers get on the road with an experienced driver teaching the student how to drive? No, so why does it need to be different if the student driver is a computer program instead of a human?
 
Do we need to give consent when other human student drivers get on the road with an experienced driver teaching the student how to drive? No, so why does it need to be different if the student driver is a computer program instead of a human?
Of course we do as a society. That’s why there are many rules regarding student drivers. There are also rules about autonomous vehicle testing. Obviously the rules are different because they’re completely different things.
No one with the mental disabilities of FSD Beta would ever be allowed to drive a car if you want to go with that analogy.
 
Of course we do as a society. That’s why there are many rules regarding student drivers. There are also rules about autonomous vehicle testing. Obviously the rules are different because they’re completely different things.
No one with the mental disabilities of FSD Beta would ever be allowed to drive a car if you want to go with that analogy.
But the question being asked in the OP is about consent of the public beyond what is required under law. Under US law there is no need for public consent to test L2 systems, nor is consent needed to test L3+ vehicles (although depending on the state, you may need a permit and other paperwork). The situation is the same as with learner's permits, where the public has no say in who gets them.

There are things, however, that do require public consent - for example, installing a new cell tower (in my city, everyone in the neighborhood gets sent a letter, allowing for protests and public comment before it is installed). There is no such thing for car testing (a bunch of companies are testing here, and they never sent any notice to the neighborhood).
 
Do we need to give consent when other human student drivers get on the road with an experienced driver teaching the student how to drive? No, so why does it need to be different if the student driver is a computer program instead of a human?
There is quite a difference, I would argue. Humans - in this case student drivers - are reasonably predictable. An experienced teacher will probably see things leading up to an incident in advance: the traffic situation, the communication, the subtle signals from the student, and can avoid leading the student into situations they have not yet trained for.

A computer car is, as we see in the videos, totally unpredictable, and will do the "worst thing at the worst time", as Tesla approximately puts it. None of us knows the real capabilities of the car. This car is "supervised" by an amateur (a biased tech nerd, newly rich on TSLA, without a worry in the world, hostile to any criticism, and unable to see any weakness in the product - not all of them, but some). Some even need it to handle every situation because their finances depend on it.

Other autonomous car companies do report their progress, so there is societal oversight.

We also can't consent to DUI or speeding, but society has traffic rules, police, and courts to deter and punish those.
 
I know of a certain group in Germany that once did lethal experiments on both children and adults without their consent.


Godwin's law, short for Godwin's law (or rule) of Nazi analogies, is an Internet adage asserting that as an online discussion grows longer (regardless of topic or scope), the probability of a comparison involving Nazis or Adolf Hitler approaches 1. In less mathematical terms, the longer the discussion, the more likely a Nazi comparison becomes, and with long enough discussions, it is a certainty.

ps: I've been on internet "forums" for 20+ years - but this has got to be one of the worst comparisons.
 
But the question being asked in the OP is about consent of the public beyond what is required under law. Under US law there is no need for public consent to test L2 systems, nor is consent needed to test L3+ vehicles (although depending on the state, you may need a permit and other paperwork). The situation is the same as with learner's permits, where the public has no say in who gets them.

There are things, however, that do require public consent - for example, installing a new cell tower (in my city, everyone in the neighborhood gets sent a letter, allowing for protests and public comment before it is installed). There is no such thing for car testing (a bunch of companies are testing here, and they never sent any notice to the neighborhood).
A good clarification. I would guess, though, that most L2 testing is done rigorously by professionals on closed premises.

My previous post laid out some issues with the bias FSD Beta owners could have, making them less suited.

And to remind everyone, my comment was a reply to the owner who acknowledged that FSD testing does in fact carry risk and asked other drivers to opt out if they were not ready for that risk.

This leads to the point we disagree on: what score in a risk assessment should FSD Beta have in different scenarios, compared to other areas of risk that society has accepted?
 
..., how to get consent from the school kids and their parents if THEY want to be part of your testing?
There is a risk-versus-reward tradeoff that has to be taken into consideration. I think most people think the reward outweighs the risk. If things go according to plan, more lives will be saved than lost. Fewer injuries by proceeding than by not proceeding. Fewer children harmed by proceeding with testing than by taking a more cautious approach. In some future quarter, Tesla will publish that FSD Beta miles are safer than non-FSD-Beta miles. Hopefully that future isn't far off.
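
To make that tradeoff concrete, the claim boils down to a per-mile crash-rate comparison. A toy illustration - every number below is a made-up placeholder, not a Tesla or NHTSA figure:
Code:
# Toy expected-value comparison. All rates are hypothetical placeholders,
# NOT published data; the point is only the shape of the argument.
human_rate = 2.0      # hypothetical crashes per million miles, human-only
beta_rate = 1.5       # hypothetical crashes per million miles, supervised beta
exposure = 100        # hypothetical millions of miles driven on beta

print(f"expected crashes without beta: {human_rate * exposure:.0f}")
print(f"expected crashes with beta:    {beta_rate * exposure:.0f}")
# The argument only holds if the beta per-mile rate is genuinely lower,
# measured over comparable roads, conditions, and driver populations.

The catch, as others in this thread point out, is that the two rates have to be measured over comparable roads and drivers for the comparison to mean anything.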
 
There is a risk-versus-reward tradeoff that has to be taken into consideration. I think most people think the reward outweighs the risk. If things go according to plan, more lives will be saved than lost. Fewer injuries by proceeding than by not proceeding. Fewer children harmed by proceeding with testing than by taking a more cautious approach. In some future quarter, Tesla will publish that FSD Beta miles are safer than non-FSD-Beta miles. Hopefully that future isn't far off.
I certainly hope so. I do believe, though, that they should reduce or stop increasing the number of testers until the system improves. A high number of testers does increase the chance of a lethal accident. Hopefully no innocents are harmed when it happens.

Kudos to all Tesla FSD owners sacrificing themselves for mankind's - or shareholders' - best interests!
 
How should we handle this?

Here, we took away the driver's license and sentenced the driver to prison. Society has a system to deter drivers.

How do we handle FSD's sudden and well-known failures, when thousands of excellent, safe drivers are suddenly safeguarding a system that introduces risk?
 
OP could be very rich. He's sitting on something that could potentially kill Tesla's FSD Beta program.

How many EXTREMELY wealthy people have beefs with Elon and would like to see Tesla implode? It shouldn't be hard to monetize and cash in on OP's good luck. It is Christmas, ya know. ;)

"Mr Jeff Bezos,

How would you like to embarrass Mr Elon Musk and damage his net worth? Tesla's valuation is partly based upon their FSD program. I'm in possession of evidence that can help destroy Tesla's FSD program and will sell it to you for $1M cash.

Signed,
OP

p.s. This isn't like unintended acceleration and I'm totally not lying. No refunds. No guarantees."

Time for the OP to contact NHTSA and close this derailed thread.

 
This one has stuck in my mind since I originally read it. While I feel bad about the accident, I can't accept that FSD Beta is the primary culprit. As has been mentioned many times in this thread, it has been (rightfully) pounded into us that this is BETA and could go badly at any moment. I think Tesla was quite astute in how it selected owners to participate. Not just in the fact that they wanted safe drivers, but because they picked the ones that 1) cared enough to change their behavior and/or 2) were "gamers" who figured out how to hack the system to get a good score. Both have seemingly acted in a fashion that demonstrates the desire to help debug FSD.

One of the best descriptions of the FSD Beta program participants that I have read is that we are "test pilots". One wrong move or lapse of attention at the wrong moment could result in bad things happening. Those of us that have always used AP extensively know the things it does well and the things it does not. Perhaps better put, we know when it is likely to do stupid stuff that may cause a crash. This applies to FSD Beta as well. I know that when driving on a well-marked road or making "simple" turns, it will likely not fail. On the other hand, I also know the opposite. When I am in close proximity to other cars traveling in the same or opposite direction, I heighten my attention. I double that attention on two-lane roads, even more so at night. I have taken over many times in these circumstances, and if a crash did happen, I would have to accept some degree of responsibility.

Thanks for reading,

Perry
 
In several years of belonging here, this is the only post by the OP, and they have not responded at all. The video is not available. I'm gonna call BS on the OP.
You can still find a copy of the video here:
Code:
https://web.archive.org/web/20211210213114/https://www.youtube.com/watch?v=7VhrG-7SBZg
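
If that snapshot link ever breaks, the Wayback Machine has a public availability endpoint you can query for the closest archived copy of any URL. A minimal sketch (real, documented API; no key needed; note that an archived YouTube page may not include the playable video itself):
Code:
import json
import urllib.parse
import urllib.request

# Query the Wayback Machine's public availability API for the snapshot
# closest to a given timestamp (YYYYMMDD...).
def closest_snapshot(url, timestamp="20211210"):
    api = "https://archive.org/wayback/available?" + urllib.parse.urlencode(
        {"url": url, "timestamp": timestamp})
    with urllib.request.urlopen(api) as resp:
        data = json.load(resp)
    return data.get("archived_snapshots", {}).get("closest")

snap = closest_snapshot("https://www.youtube.com/watch?v=7VhrG-7SBZg")
if snap and snap.get("available"):
    print(snap["url"], snap["timestamp"])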

Whatever the circumstances, even if the OP was telling the complete truth: if he got a lawyer involved, as he claimed he was planning to, I think the first thing any decent lawyer would do is tell him to take the video down and cease communicating with the public. It's also possible he settled with Tesla and got an NDA (which may have been the goal of publishing the video in the first place).
 