Welcome to Tesla Motors Club

Any Actual Crashes with FSD?

Like most other folks here, I'm really super-vigilant while test driving this FSD Beta. If you're like me, you've grown more confident with this system as you've gotten more experience with its strengths and weaknesses. We're all learning. Of course it has problems, and that's why we're testing it -- to improve it. Frankly, I've driven a number of production cars that have had safety issues of various types, and I'll bet that every one of you has too. As drivers, we learn, and adapt to any car's weaknesses, and enjoy its strengths.

This advanced suite of software, computing, and sensors adds an element of performance and safety that is really promising. It's teaching us to be better, more alert, more observant drivers. We're testing this software, and it is testing us as well. This is as serious as it gets, and the improvement of this software will lead to full autonomy -- for better or worse.

Automobile crashes happen every minute of every day, and the carnage on our highways is mostly caused by driver errors. We accept risk, and operate on a continuum of acceptable or unacceptable risks. FSD will never get intoxicated, fall asleep at the wheel, be distracted by a text message, or experience road rage. Many folks have posted that this is just like monitoring a student driver, and that is "spot on". Our cars are learning, and we must be good teachers, good mentors, and eventually good, honest judges of their performance.

To answer the OP's question, there surely have been "accidents" while driving on FSD Beta. Everyone knows that someone will die at some point in this process. Advancements in medicine, aeronautics, and technology are almost always accompanied by fatalities -- but those advancements ultimately make us safer in the long run. The NTSB understands that viscerally. A lot fewer people will die on the world's highways because of this engineering achievement and the breakthroughs that are happening right now.

I really do believe that we are test pilots in this new frontier. Test piloting is not easy, and it really makes you sweat sometimes, but this experience will surely make you a better, safer driver, and every test drive that we do will make our cars better too. That's good for everyone -- drivers, cyclists, and pedestrians alike.

Everyone has their fingers crossed that the regulators don't do something stupid. If this becomes as politicized as public health has become, we're in real trouble. There are so many variables at play. We stand on the threshold of an open door that's blowing in the political wind -- and may be slammed in our faces.

I also want to thank all of you who have offered your observations and suggestions to help me learn. I really want to be the best FSD Beta tester that I can become, and I hope that you do too.
 
Well said! I could not agree more.

The fact is that self-driving cars only need to be safer than human drivers to save lives, and that is a very low bar. Of course they will be held to a higher standard than that, which is natural, but, as you say, if it becomes politicized it will become a terrible mess. And in that case the US will once again surrender a potential strength to the Chinese and other foreign powers.
 
It would be illegal to make that DB “publicly searchable”.

If there is a crash, Tesla will report it.

Anything else is a conspiracy theory. You should try your conspiracy theories on Facebook/Meta. Mark loves them, I believe.
Actually, the standing order from the NHTSA intends to make the crash information publicly available, although certain sensitive information may be redacted:
Are the crash reports required by the Order publicly available?
NHTSA intends to make summary crash information it receives under the Order publicly available. NHTSA will process the information it receives under the Order, and then will begin making information publicly available on NHTSA.gov. By law, NHTSA may not publicly disclose certain information, including personally identifiable information (PII) (such as the identity of individuals involved in crashes) and confidential business information.
Standing General Order on Crash Reporting for Levels of Driving Automation 2-5 | NHTSA
Note there are specific requirements for what is reportable, and less serious crashes have a longer grace period (reports are due by the 15th of the month after the automaker receives notice of the crash), so there will be a substantial delay even once NHTSA makes data available.
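As a rough illustration of that grace-period timing (a sketch only; `monthly_report_due` is a hypothetical helper I made up, not anything published by NHTSA), the due date for a less serious crash works out like this:

```python
from datetime import date

def monthly_report_due(notice_date: date) -> date:
    """Hypothetical sketch: less serious crashes under the Standing
    General Order are reported by the 15th of the month following the
    month in which the automaker receives notice of the crash."""
    if notice_date.month == 12:
        # Notice received in December rolls over to January 15 of the next year
        return date(notice_date.year + 1, 1, 15)
    return date(notice_date.year, notice_date.month + 1, 15)

print(monthly_report_due(date(2021, 11, 3)))   # 2021-12-15
print(monthly_report_due(date(2021, 12, 20)))  # 2022-01-15
```

So a crash Tesla learns about early in a month might not show up in NHTSA's data for six weeks or more.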
 
Actually, the standing order from the NHTSA intends to make the crash information publicly available, although certain sensitive information may be redacted:

Standing General Order on Crash Reporting for Levels of Driving Automation 2-5 | NHTSA
This is the only way to definitively answer the OP's question at this time. There doesn't yet seem to be any data publicly released by the NHTSA or Tesla on crashes, except where Tesla/Elon have declared there have been no accidents on FSD. No known crashes have been examined either, with the exception of curb strikes, AFAIK.

Tesla must report on any incident that they have been made aware of.

It is possible that the term "test reports" in NHTSA's Amended Order (pdf), Definitions§14, means that if an FSD Beta user presses the snapshot button, submits an email to Tesla, or disengages (thus sending notice to Tesla), Tesla has received notice from the participant that an incident of note has occurred. Tesla would then have to investigate and report any such incident.

i.e., if you are involved in an incident and press the snapshot button, that might constitute giving notice to Tesla. A follow-up email from you to the Tesla FSD team would definitely constitute providing notice to Tesla.

Certain vehicle sensor reports may also provide notice to Tesla of a reportable incident.

"Crash" from Definitions§5 may include a curb strike if that satisfies "physical impact between a vehicle and ... property that results or allegedly results in any property damage"

I hope they release some of the reports soon, it will be useful to compare all the different ADAS systems and track any trends.
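The "crash" definition above can be paraphrased as a simple predicate. This is my own sketch of the Order's wording, not an official interpretation, and the function name and parameters are made up for illustration:

```python
def is_reportable_crash(physical_impact: bool,
                        property_damage_alleged: bool,
                        injury_alleged: bool) -> bool:
    """Rough paraphrase of Definitions§5: a 'crash' is a physical impact
    between a vehicle and a person, another vehicle, or property that
    results (or allegedly results) in property damage or injury."""
    return physical_impact and (property_damage_alleged or injury_alleged)

# Under this reading, a curb strike with alleged property damage qualifies
print(is_reportable_crash(True, True, False))   # True
# Contact with no damage or injury alleged would not
print(is_reportable_crash(True, False, False))  # False
```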
 
Thanks. I am pretty clear to anyone who asks that NO ONE should expect FSD in any vehicle or make yet. I believe I have seen some ads from one car manufacturer (?) suggesting that a person could drive with their hands off -- how dangerous and irresponsible. I always maintain control of the car -- and actually enjoy many aspects of it.
 
Actually, the standing order from the NHTSA intends to make the crash information publicly available, although certain sensitive information may be redacted:

...

I believe NHTSA has no authority to unilaterally flout privacy rules (including HIPAA, CCPA, etc.).

As I said, making the "crash db" searchable by Tesla would be illegal under CCPA and other laws.

There are plenty of existing NHTSA crash reports that are searchable by Tesla, California, and other search terms, with detailed information about the drivers' injuries and the crash location. As @stopcrazypp says, only some personal information has been redacted. Nothing yet on ADS/ADAS reports.

https://crashstats.nhtsa.dot.gov/
 
Everyone knows that someone will die at some point in this process.
I don't think we can accept one single death testing FSD. Especially third party injuries or deaths would be catastrophic.
 
This crash story sounds bogus to me. Once you take back control of the car, it does NOT take it back from you. I am guessing the driver was NOT paying attention and let the car turn into the lane, then blamed it on the car rather than his inattention. Illustrates why this FSD beta should not be released to irresponsible drivers.
 
This crash story sounds bogus to me. Once you take back control of the car, it does NOT take it back from you. I am guessing the driver was NOT paying attention and let the car turn into the lane, then blamed it on the car rather than his inattention. Illustrates why this FSD beta should not be released to irresponsible drivers.
I wonder if it was some sort of mode confusion. I know that when I have disengaged FSD quickly using the wheel in an intersection, I am left in TACC, and it has caused a bit of confusion about what the car was doing, as I didn't have full control yet. It took my brain a second to figure out I needed to hit the brake to gain control -- all while still navigating an intersection.

That is the reason I only use the beta at low speeds and in VERY light traffic. I think they should change the beta to fully disengage with the wheel, at least on city streets.
 
Even if there is an accident while FSD Beta is on, it will be blamed on driver error. Users are expected to intervene when a crash is imminent.

Users should intervene at the first sign of FSD beginning to do something unsafe, not wait until a crash is imminent.

People worried about too many crashes when FSD is released to more users are reasoning incorrectly. Sure, there will be accidents caused by people not paying proper attention, but not as many as within the general population of all cars. People already don't pay attention, even in cars without AP or FSD. That's why the accident rate is so high in the general population, and the fact that the accident rate is 8 times lower in cars actively using AP suggests these systems make people pay MORE attention, on average, not less. This means there will still be accidents on FSD, but at a lower rate than in all other cars.
 
Perhaps it depends on number of lives saved? If FSD statistically saves 10 lives including third parties, but 1 is killed, would that be acceptable or unacceptable?
What are you on? FSD saves lives??? lol, yeah, maybe in the year 2055. News flash: we're in 2021 with half-baked FSD using training wheels. One of my friends just got pulled over tonight because the officers thought he was under the influence. "No, I haven't been drinking, sir. I have the car on FSD and it's trying to drive itself. My bad that the car was driving erratically and stopping 10 feet before the stop sign." ROFLMAO
 
Won't Autopilot and FSD disengage quickly before any accident? Tesla counts accidents within 5 seconds of disengagement, right? I'm not sure everyone uses the same methodology (highway patrol, media, drivers).

Both Autopilot and FSD leave cruise control on when they disengage, right? That is what freaks me out the most. It's especially easy to miss the two quick chimes with music on loud.
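That 5-second convention can be sketched as a simple check. This is an illustration only -- Tesla's actual counting methodology isn't public code, and the function name and parameters here are made up:

```python
from typing import Optional

def attributed_to_adas(impact_time_s: float,
                       disengage_time_s: Optional[float],
                       window_s: float = 5.0) -> bool:
    """Sketch: count a crash against the ADAS if the system was still
    engaged at impact, or disengaged within window_s seconds before it."""
    if disengage_time_s is None:
        # System never disengaged before the impact
        return True
    return 0.0 <= impact_time_s - disengage_time_s <= window_s

# Disengaged 3 s before impact: still counted against the system
print(attributed_to_adas(impact_time_s=13.0, disengage_time_s=10.0))  # True
# Disengaged 7 s before impact: outside the window, not counted
print(attributed_to_adas(impact_time_s=17.0, disengage_time_s=10.0))  # False
```

The point of such a window is to prevent a last-instant disengagement (by the driver or the system) from moving a crash out of the ADAS statistics.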
 
Perhaps it depends on number of lives saved? If FSD statistically saves 10 lives including third parties, but 1 is killed, would that be acceptable or unacceptable?
Interesting question, similar to some classic ethical problems widely discussed.

I think the answer differs depending on whether the system is still in development or finished. If one can document a huge reduction in casualties and injuries, it would be much different from "may happen" promises.

The acceptance would also depend on who the new kind of casualty is: a tester, say a shareholder, or an innocent kid or young mother.

Another axis is whether we reduce deaths in the group of drunk drivers or speeding young men but increase deaths in the group of "kids crossing on green".

In my country, 2/3 of fatal traffic accidents involve a drunk driver.

Remember that Uber's tragic accident led directly to the failure of their autonomy program. Society will probably not be kinder to Tesla and the richest man in the world.