Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

[UPDATED] 2 die in Tesla crash - NHTSA reports driver seat occupied

No. However, there are a number of YouTubers who show that Autopilot can continue to function without anyone in the driver's seat.

Since Tesla has already programmed Autopilot not to work with the seatbelt unbuckled, it just needs to add one more factor, the seat's weight sensor, rather than relying on seatbelt status alone.
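A check like that could be sketched roughly as follows. This is purely hypothetical logic to illustrate the idea, not Tesla's actual code; the function and sensor names, and the weight threshold, are all invented:

```python
# Hypothetical sketch: gate Autopilot on both seatbelt status AND seat occupancy.
# Tesla's real implementation is not public; everything here is illustrative.

MIN_OCCUPANT_WEIGHT_KG = 30.0  # arbitrary threshold for illustration only

def autopilot_may_engage(seatbelt_buckled: bool, seat_weight_kg: float) -> bool:
    """Allow Autopilot only if the driver's seat is both belted and occupied."""
    return seatbelt_buckled and seat_weight_kg >= MIN_OCCUPANT_WEIGHT_KG

# Buckling the belt behind you would no longer fool the check on an empty seat:
print(autopilot_may_engage(seatbelt_buckled=True, seat_weight_kg=0.0))   # False
print(autopilot_may_engage(seatbelt_buckled=True, seat_weight_kg=75.0))  # True
```

The point is just that requiring two independent signals makes the trick shown in those YouTube videos much harder to pull off.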

Not many Autopilot testers die, but the most recent incident is all over the national news. Infrequent, but that's bad enough!

Yup.

 
We do have to remember that there are two real-life families, probably of very fine character, who have lost their fathers/husbands in a horrible accident, not somewhere far away and forgettable but only steps from their homes. So without ruling anything out, I think it's proper to put the various nefarious, non-evidence-based theories (murder, manslaughter and flight, suicide pact) extremely low on the probability chart.

I agree with this.

I feel like the order I listed them in is both respectful to the people who died and the most realistic given my own world view, except I would rank mechanical failure or an electrical glitch higher than foul play. That's in terms of a non-evidence-based list, since there is no evidence at this point in time.

If I were the one who died, I wouldn't want to be accused of playing silly games trying to get AP activated. It's okay for me to die as a result of doing something I felt I had control over, like driving fast, versus getting killed while doing something embarrassingly dumb like getting out of the driver's seat while on AP. You know you died in a dumb way when Congress is investigating your accident as an excuse to write nanny laws.

Foul play is always something to at least consider in a single-vehicle fatality where how it happened seems odd. The lack of a driver makes it very odd.
 

Yeah, I feel like we've exhausted all the likely and unlikely scenarios, and now we just have to wait. And that's not to say it can't be something we never even thought of, like a poorly placed floor mat that caused the throttle to stick. Rather than an indictment of AP, this whole thing may end up being a lesson in what officials should say after a preliminary look at a crash scene. Given the circumstances, officials should have known there would be a deeper investigation, so it might have been prudent to point out the locations of the bodies and mention that further investigation was needed. Then the headlines might have read "Two die in fire in front and rear passenger seats of crashed Tesla, prompting questions about crash circumstances".

Mike
 
From Tesla's crash report, a couple of interesting things:
- Crash data is collected when "an airbag or other active restraint deployed"
- Crashes are counted when Autopilot was deactivated within 5 seconds before the crash

So, we can guess that the Woodland crash:
- resulted in airbag/active-restraint deployment
- AP was not active in the last 5 seconds (though Musk doesn't mention the 5-second rule)

Considering the drive itself was just 10 seconds or so, we can be quite confident that there was no hacking to get AP running in this instance. Or if there was, it lasted just a couple of seconds before disengaging.

https://www.tesla.com/VehicleSafetyReport

We collect the exact amount of miles traveled by each vehicle with Autopilot active or in manual driving, and do so without identifying specific vehicles to protect privacy. We also receive a crash alert anytime there is a crash that is correlated to the exact vehicle state at the time. This is not from a sampled data set, but rather this is exact summations. To ensure our statistics are conservative, we count any crash in which Autopilot was deactivated within 5 seconds before a crash, and we count all crashes in which the crash alert indicated an airbag or other active restraint deployed. In practice, this correlates to nearly any crash at about 12 mph (20 kph) or above, depending on the crash forces generated.
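The counting rule in that quote boils down to a simple predicate: a crash is attributed to Autopilot if AP was active at impact or deactivated within the preceding 5 seconds, and the crash alert indicated an airbag or other active restraint deployed. A rough sketch of that reading (my interpretation of the quoted text, not Tesla's actual code; the field names are invented):

```python
# Sketch of the crash-counting rule described in Tesla's Vehicle Safety Report.
# This encodes one reading of the quoted text; names are illustrative only.

AP_WINDOW_S = 5.0  # count crashes where AP was off for at most 5 s before impact

def counts_as_autopilot_crash(ap_off_seconds_before_impact: float,
                              restraint_deployed: bool) -> bool:
    """True if the crash counts toward the Autopilot statistics.

    ap_off_seconds_before_impact: 0.0 means AP was still active at impact;
    use float('inf') if AP was never active during the drive.
    """
    return restraint_deployed and ap_off_seconds_before_impact <= AP_WINDOW_S

print(counts_as_autopilot_crash(0.0, True))   # True: AP active at impact
print(counts_as_autopilot_crash(3.0, True))   # True: deactivated 3 s before
print(counts_as_autopilot_crash(8.0, True))   # False: outside the 5 s window
```

Under this reading, a last-second disengagement (someone grabbing the wheel just before impact) still counts against Autopilot, which is what makes the statistic conservative.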
 

I read the information at that link a bit differently. They are talking about which crashes they consider when evaluating AP safety. To be conservative, they "count" all crashes where AP was either on at the time of the crash or had been on no more than 5 seconds before the crash occurred (indicating someone may have tried to take control at the last second). I take this to mean they are able to get crash data for any collision, but for the statistics they report on the safety of AP, they only include crashes where AP had been on very close to the time of the accident and where airbags went off (indicating it was more than just a minor tap).

Mike
 
Is it possible they hadn't noticed AP was disabled, and one wanted to show the other how the car could take the turn on its own at higher speeds?
My guess is the doctor would have figured out within 2 years that AP doesn't work on his home street. But if he was drunk, who knows...

I think the simplest explanation continues to be flooring it, losing control, and moving to a different seat after the crash to get out.

PS: A doctor would probably say to remain still until help comes if someone gets a fracture. One more thing to consider, I guess.
 
I read the information at that link a bit differently.
How so?

The info I get from that is:
- they gather reports on all crashes
- they know whether AP was on up to 5 seconds before the crash

So, in the case of the Woodland accident:
- they gathered the crash report
- they would know whether AP was on up to (at least) 5 seconds before the crash
- Musk has the above information

So, given all this, since Musk tweeted no AP at the time of the impact, we can reasonably assume:
- no AP for at least 5 seconds before the crash. There would be no point in Musk saying AP was not engaged if he knew it was engaged, say, 3 seconds before the crash; the info will come out anyway.
- Since the whole drive took only 10 to 15 seconds, the chances that AP was engaged are low if it wasn't engaged in the last 5 seconds.
 

Applying how Tesla reports crashes WRT Autopilot to the crash in question requires a lot of assumptions. First, you have to assume they have a crash report. They may not. If the cloud upload was disabled at the instant of the crash due to damage, the latest information they have may be 10 seconds old, 30 seconds old, or even older depending on upload speed, cell signal, etc. I'm sure there's some lag, as nothing uploads instantly. So that might be why Elon said "retrieved so far": the crash report that sends video from the cameras and other data after the crash may not be retrievable. But Tesla could download data from before the crash using the specific VIN, and may have noticed an earlier part of the drive and noted AP wasn't being used at that point. Hence the "so far" comment. We just don't have that information.

Mike
 
My guess is the doctor would have been able to figure out in 2 years that AP doesn't work on his home street. But if he was drunk - who knows ...

For my own cars, some drivers just keep missing the cues that TACC/Autosteer was not on, either because it failed to engage when they tried or because they forgot to turn it on. And as a passenger, I had to remind them that either both were off or only TACC was on and not Autosteer. After 4 years, they still get all offended: "I know, I know, this is not my first drive."
 
The really good news is that Consumer Reports has developed a solution:
 

Attachments

  • Screen Shot 2021-04-24 at 8.11.32 PM.png
But I also think it is true that Tesla has encouraged some of this stupidity by promoting its driver assistance tech - named Autopilot and Full Self Driving - in a sometimes exaggerated manner, fine print disclaimers notwithstanding.

I have to agree with this. I took my car to a car show yesterday, and the most-asked question was "How long does it take to charge?"; the second, "Does the car really drive itself?"

This accident we are discussing here almost kept me from going to the show, as we are in Texas and only about 100 miles from where the incident happened. I was worried it would be ALL I hear about. But... it was only mentioned two, maybe three times directly to me, and once I heard several guys within earshot talking about it. I felt Han Solo-ish explaining AP and FSD to them: "That's not how The FSD works..." LOL
 
I'd have to agree WRT the naming conventions. Even though autopilot has never meant "pilot free", the general public gets the wrong idea just from the name. I would have preferred "copilot" but Ford has now taken that one. And it's kinda too late now anyway: I doubt Tesla will back down unless they are forced to because it's pretty much admitting defeat. And "Full Self Driving" just makes matters worse.

Mike
 
To me it seems ridiculous to get so wrapped around the axle about the name "Autopilot".
That term has been around for over 30 years, obviously rooted in aviation. And today's Tesla AP does exactly as the name suggests, just like in airplanes and helicopters, and in keeping with the definition. It wouldn't matter a hill of beans what you call it; people are going to do stupid things.
I'm tired of hearing that some poor new owner crashed his or her car and blamed it on the name of the system. We are responsible for our actions and for learning how to use the equipment. Maybe reading the owner's manual or watching the included how-to videos would be a good start.

Sorry... had to get that off my chest.
Please carry on with the crash theories.
 
I think that Tesla needs to rename it "Not An Autopilot"
 

Even if the name were misleading, people would test Autopilot before relying on it. Certainly, nobody is going to hang weights from the steering wheel, turn on Autopilot, and climb into the passenger's seat without testing the system first.

I am sick of hearing all the criticism of Autopilot and FSD. It is all moronic, in my view.