Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
Tesla employee killed in crash involving FSD?

Who cares what FSD did. If you’re intoxicated and crash, we know why that happened.
It would be good if driver assistance systems prevented drunk driving deaths, whether engaged or not. Remember many sober innocent people die in such accidents.

So what the system is capable of is relevant.

Of course, this is a terrible example, because it sounds like even Tesla does not know what was engaged.

All we know is that the system is not capable of this sort of accident avoidance at the current time, whether engaged or not.
 
But this was a long time ago, possibly before the FSD beta was available.
 
2022
FSD Beta was available from October 2021 or so.

No idea what was in use if anything.

Presumably he had access to FSD Beta, but there is some lack of clarity in the article. It’s not clear whether they understand the distinctions; I don’t think it’s directly addressed (I only quickly skimmed it).

They conflate FSD Beta with FSD, and at the time FSD and FSD Beta were different. These days they are largely the same, though in Autosteer (Beta) there are features which are only available to FSD buyers. So there is still crossover which makes it unclear exactly what is being discussed.
I was hopeful the video evidence above would confirm that FSD Beta was available to this owner, but it does not appear it can be used to address this topic. Video was nearly certainly taken before single stack in late 2022, so visualizations were apparently identical on the freeway.
 
Given this happened almost 2 years ago, one would have to assume Tesla has pulled all the data logs and knows EXACTLY what happened. Wouldn't the family have sued Tesla if AP/FSD was at fault?
According to Tesla, Tesla does not have access to the logs. There is a video at the end of the article with the investigating officer, which covers this. The family has not sued because they cannot get a lawyer to take the case, given that von Ohain was very drunk.

“At the time, Tesla had just introduced Full Self-Driving, and would eventually release it to a wider group of owners who had been monitored by the carmaker and declared safe drivers. Like many Tesla employees, von Ohain received the feature — then a $10,000 option — free with his employee discount, according to Bass and a purchase order reviewed by The Post.”

So he did have FSD, but that was before the wide release of Beta, so he would have had to qualify via the Safety Score method in 2022. It is unclear on that front what was actually available to him - the above paragraph does not cover whether von Ohain ever qualified via Safety Score.

“However, Tesla did report the crash to the National Highway Traffic Safety Administration. According to NHTSA, Tesla received notification of the crash through an unspecified “complaint” and alerted federal authorities that a driver-assistance feature had been in use at least 30 seconds before impact. Because of the extensive fire damage, NHTSA could not confirm whether it was Full Self-Driving or Autopilot.”

So it was reported but sounds like Tesla doesn’t know what was in use; they don’t have the data nor do they have any way to access it (see the video).

It’s pretty weird to me that this article does not dive into the details of exactly how one got access to FSD Beta in 2022 and whether von Ohain had gone through that process - it simply establishes that he had purchased FSD, which is not the same thing. These stories never quite capture all the relevant information. It’s annoying.

My guess is that he was using AP on a curvy road, based on the described behavior: FSD does not frequently depart the road (it only does so occasionally), yet that is what was reportedly happening here. But it is just a guess and we will never know, since the Washington Post refuses to investigate that point (it is knowable, at least from witness/family interviews and a check of whether the FSD Beta admission email was ever received):

“The car’s driver-assistance software, Full Self-Driving, was struggling to navigate the mountain curves, forcing von Ohain repeatedly to yank it back on course.

“The first time it happened, I was like, ‘Is that normal?’” recalled Rossiter, who described the five-mile drive on the outskirts of Denver as “uncomfortable.” “And he was like, ‘Yeah, that happens every now and then.’””

This just sounds like AP to me (also known incorrectly as Full Self-Driving). It is possible that Ohain had purchased FSD for free but did not have access to FSD Beta (a very common scenario in 2022).

Regardless, we can hope in the future that all vehicles will be able to avoid this sort of accident, with background safety features with capabilities exceeding FSD Beta capabilities today. Avoiding collisions and uncommanded road departures by drunk and sober drivers, in the background, without false positives, is an exceedingly difficult “Holy Grail” but would be a huge step forward for vehicle safety - and it wouldn’t even necessarily require L2 or L3 to work reliably (unclear)! One day.

I guess I should cancel my WaPo subscription, or at least file a complaint about incomplete reporting.
 
Really?

And then you post: “At the time, Tesla had just introduced Full Self-Driving...”
 
The idea here is to establish facts.

You said it was possible this was before FSD Beta was available. This is not possible, since FSD Beta WAS available.

I have clarified that it was available (at this time via Safety Score qualification and eventual (slow) admission).

I doubt FSD Beta was engaged during or before this collision. And I doubt Ohain had access to it (he only owned FSD, so without further effort only had access to AP/Autosteer and Stop Sign/Traffic Light Control and Green Light Beep). Just based on the balance of the evidence. I may well change my mind about that based on additional evidence.
 

But the absolute most important thing is that whatever it was then, isn't what it is now and all the rules have changed.

So completely irrelevant news filler.
 
Perhaps.

I think if FSD Beta were engaged (which I strongly doubt) before or during the accident, that would be notable. It’s not really that important that it has improved since then.

But anyway there’s no information to determine this in the article.

Likely just over reliance on the capabilities of AP. And being very drunk.
 
Here's another article with a bit more info (and a large amount of sensationalism/misinformation)

The important bit though is this:
[attached screenshot]


Here's a google maps link to the crash location

Looks to me like the car made no attempt to turn and went straight off the road. I wonder if the driver was using Autopilot (or maybe FSD) and tugged the wheel too hard, dropping the car into TACC-only mode. The (drunk) driver thought the car was still steering when it wasn't, and the car went straight off the road.

Another thing: in the video attached to the link, at 0:55 or so, there's a recording of what I think was a responding officer saying "it just drove straight off the road at 70 miles per hour". Not sure if the cops really knew the speed (or even how they would've), but that would be interesting if true, given there's a speed limit sign 100 ft after the crash location marked 40 mph.
 
How about a news story about every person who fell asleep and Tesla's AP/FSD prevented an accident?

I know of one coworker who did exactly that.

I also know of another coworker who wasn't driving a Tesla and got into a severe accident.

Better yet, instead of anecdotes, why can't Tesla be more transparent about AP/FSD accident statistics? If it works as well as they claim, what do they have to hide?
 

I understand your point, but these are false equivalencies, in my opinion. If a life is lost due to a fault of the system, it doesn't get "redeemed" by someone else's life being saved. It's not a zero-sum game. In the cases you describe, the system simply worked as promised. You expect it, based on Tesla's marketing and publicly made statements, to work that way. You don't expect it to run into a solid wall, for example, as was the case in one tragic incident.

To look at it another way... If a jet crashes, killing 300 people on board, no one says "well, but that jet carried 2,000,000 people last year without a single fatality." No, you investigate the cause and address the issue transparently and openly with the goal of preventing further loss of life.
 
This was my first question about this too.

There is a video in the article. Can someone identify whether he is using FSD Beta in that video based on the appearance of the screen? Obviously there is the complexity of single stack (this was on the freeway, and the visualizations certainly do not look the way FSD does today on the freeway). But I seem to recall blue tentacle meaning FSD Beta and blue lane lines meaning AP...was that the case even on the freeway, before single stack? Anyway, I don't remember the details before and after single stack release about how it looked on the freeway and whether this video was taken before or after single stack, so putting it out here where I am sure someone can recall the exact version being used.

Anyway if the video capture below actually shows him using FSD Beta, then I think it's reasonable to assume that if the feature was engaged at the time of the crash, he was using FSD Beta, not AP (it's not really common to switch back and forth).

Unfortunately, based on the story it sounds like we'll never know whether any assistance feature was engaged, since according to Tesla no data connection existed at the time of the accident, and the car was burnt badly enough that it was impossible to determine from the vehicle itself. That's somewhat inconsistent with other information in the story (where it is misstated, I think, that FSD was engaged at least 30 seconds before the crash - that's not the criterion), but maybe the complaint mentioned in the story just alleged that that was the case.

[attached screenshot]

Yeah, that's NoA.

FSD Beta at the time (even on freeway) = New "FSD" visuals + Blue Tentacle

NoA = Old "FSD/AP" visuals + blue tentacle

AP = Old "FSD/AP" visuals + lane lines
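The visual distinctions above amount to a small lookup table. As a sketch in Python (the mode names and cue strings below are informal labels restating this poster's claims, not official UI values or verified behavior):

```python
# Hypothetical sketch of the visuals-to-mode mapping described above.
# The cue strings are informal labels from this thread, not actual UI strings.
def infer_mode(visuals: str, path_indicator: str) -> str:
    """Guess which assistance feature was active from on-screen cues."""
    if visuals == "new FSD" and path_indicator == "blue tentacle":
        return "FSD Beta"
    if visuals == "old FSD/AP" and path_indicator == "blue tentacle":
        return "NoA"
    if visuals == "old FSD/AP" and path_indicator == "lane lines":
        return "AP"
    return "unknown"

print(infer_mode("old FSD/AP", "blue tentacle"))  # NoA, per the table above
```

The point of the table is that the old visuals plus a blue tentacle (as in the posted video) only tells you NoA was engaged at that moment, not whether the owner had FSD Beta access.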
 

Sure, but everyone knows that AP and FSD will never ever be perfect, and today, they are both driver assist systems.

You say you don't expect it to run into a wall, but it very well might; that alone doesn't mean much because, once again, it's never going to be perfect.

This is a game of whether you want to throw the baby out with the bath water.

It's another story if it runs into walls, and Tesla does nothing to mitigate that. And we know that Tesla is a safety-oriented company and does do its best to make AP and FSD safer.
 
Thanks.
Yeah, so it is not possible to say from that whether he had FSD Beta access. Since he was on the freeway, he was using NoA, which is all that was available at the time.

But as discussed I think it is unlikely FSD Beta was in use. The WaPo could have investigated further and proven it was available (or not) to this owner. But they did not. At a minimum that is needed before speculating further.

A poorly researched article, since it did not uncover a key KNOWABLE piece of information (was FSD Beta available to him?).

What it did determine was that what was in use at the actual time of the accident is not knowable. But would be good to reduce the set of possibilities.
 
I'm reading this article. The driver was drunk. I'm not saying "tipsy" or that he'd had "one or two." He had a blood alcohol level of 0.26, which anyone who has ever played around with a breathalyzer at a college party 🤐 would know is s***faced drunk. Tesla's improperly named "full self driving" system is certainly not that, but in this case the story starts and ends with the 0.26 blood alcohol level, in my opinion. Any attempt to shift the blame to Tesla is absurd. Again, in this particular case.