Fatal accidents while on autopilot and driver's expectations

I watched some of the video of the lawyer making the case against Tesla for that fatal Autopilot accident that happened in March this year, in Florida I think. I didn't watch the whole thing, but IMO it's pretty obvious that the tractor-trailer driver was at fault for pulling out when he did not have the right of way. I suspect that if you replaced the Tesla Model 3 with some other common car, like a BMW, Honda, Toyota, or whatever, at least some of those cars would have collided with the trailer as well. I guess the question is: if Autopilot had not been enabled, would there still have been a crash?
Anyhow, if you surveyed Tesla owners who have and use Autopilot to see whether they know exactly what Autopilot will and will not do, a significant number probably wouldn't know. How can you increase the education rate?
This might sound unrealistic at first, but I propose that Tesla should have an online certification course that must be completed once a year, or whenever a big change is made. Make the course available online and also on the screen inside the car. Tie the certification to each driver profile. Disable Autopilot for any driver who does not complete the course within a month (a rough sketch of that gating logic is at the end of this post).
Autopilot is a very new thing, and you are going to have to force people to take some time to understand what it can do and what it cannot do.
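To make the proposal concrete, here is a minimal sketch of how that per-profile gating could work. It is purely hypothetical: the profile fields, the 30-day grace period, and the yearly expiry are my own assumptions for illustration, not anything Tesla actually implements or exposes.

from datetime import date, timedelta

GRACE_PERIOD = timedelta(days=30)     # hypothetical: disable AP if a newly assigned course sits untaken for a month
CERT_VALIDITY = timedelta(days=365)   # hypothetical: recertify once a year

def autopilot_allowed(profile, today=None):
    """Return True if this (hypothetical) driver profile may engage Autopilot."""
    today = today or date.today()
    last_certified = profile.get("last_certified")      # date the course was last completed
    course_assigned = profile.get("course_assigned")    # date a new course was assigned after a big change

    if last_certified is None:
        return False                                     # never took the course: AP stays off
    if course_assigned and today - course_assigned > GRACE_PERIOD:
        return False                                     # ignored a newly assigned course for more than a month
    return today - last_certified <= CERT_VALIDITY       # otherwise the certification is good for one year

# Example: a driver last certified 400 days ago would lose access until recertifying
driver = {"last_certified": date.today() - timedelta(days=400), "course_assigned": None}
print(autopilot_allowed(driver))  # False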
 
...Disable Autopilot for any driver who does not complete the course within a month...

First, let's establish who is the boss around here:

"There is only one boss. The customer. And he can fire everybody in the company from the chairman on down, simply by spending his money somewhere else." Sam Walton

An owner owns the car. A seller just can't take ownership back because the owner flunked a test.

But I agree that Tesla should be more transparent and post its progress on an Autopilot/FSD blog:

1) Autopilot Accidents and why and the road map on how to improve it.

2) Capabilities: such as whether simple summon can use 8 cameras yet...

Anyhow, Tesla will need to solve this collision scenario for FSD because there won't be a driver to manually override the system in this scenario.
 
Self-driving will be just that: self-driving, with no user intervention.
Until then, everything will probably be "beta", or more specifically, the responsibility will lie with a person, not the car. If the car decides to do a U-turn in the middle of the Interstate, it will still be the person's fault.
It doesn't matter whether you could anticipate the car's action or not; it will still be your fault.

Training would probably confuse the issue even more. "But Tesla told me in class that the car would..." would become liability hell.
Honestly, many of the updates over the last year have caused the car to drive differently, although it has been a little stagnant over the past few months. There's a curve on my regular route that it could not handle at all last year; release after release it got better, albeit at a slow speed, and today it manages to make it at about 43 mph. It will do 44 mph, but gets upset and throws an alarm.

This is just a case of a lawyer trying to make some big bucks. Big company, new technology not completely trusted, $$$$$$
 
I suspect that if you replaced the Tesla Model 3 with some other common car, like a BMW, Honda, Toyota, or whatever, at least some of those cars would have collided with the trailer as well. I guess the question is: if Autopilot had not been enabled, would there still have been a crash?

People do run under tractor trailers in regular cars. It took the tractor-trailer quite a bit of time to make it all the way across toward the median on the left, so I imagine most drivers who were paying attention could easily have avoided the accident. It wasn't as if the trailer pulled out at the last second; the Tesla would have hit the cab in that case. No evasive maneuvers were made by the Tesla driver.

The question is: did AP make the driver more comfortable looking down at his phone (or whatever distracted him)? He engaged AP 10 seconds before the crash. Either he turned on AP and immediately had a medical event, or he turned on AP and took his eyes off the road. If he had been in a regular car without AP, would he still have looked away from the road? Maybe, maybe not. Maybe he thought turning on AP made it safer to look down. Unfortunately we probably won't ever know what he was thinking.
 
I admit to flipping on EAP when I'm about to do something distracting on a highway, like reading a text off my phone. Otherwise I normally don't use it. Would I have still done the distracting thing in my old car? Probably.

Guilty here too. I suspect some people keep their eyes off the road far longer with AP than they would try to get away with without it, though. Someone on Reddit posted that they "regularly" turn on AP, close their eyes, and turn and kiss their wife while driving. They described that as "pure joy," and I doubt they would do that in a non-AP car.

I know that if I am futzing with the various nested menus in the car, or trying to adjust the climate, I am looking down, up, down, up, down, up, etc. Maybe on AP I look down for longer than I would otherwise? I probably do. It would be an interesting study for sure.
 
I watched some of the video of the lawyer making the case against Tesla for that fatal Autopilot accident that happened in March this year, in Florida I think. I didn't watch the whole thing, but IMO it's pretty obvious that the tractor-trailer driver was at fault for pulling out when he did not have the right of way. I suspect that if you replaced the Tesla Model 3 with some other common car, like a BMW, Honda, Toyota, or whatever, at least some of those cars would have collided with the trailer as well. I guess the question is: if Autopilot had not been enabled, would there still have been a crash?
Anyhow, if you surveyed Tesla owners who have and use Autopilot to see whether they know exactly what Autopilot will and will not do, a significant number probably wouldn't know. How can you increase the education rate?
This might sound unrealistic at first, but I propose that Tesla should have an online certification course that must be completed once a year, or whenever a big change is made. Make the course available online and also on the screen inside the car. Tie the certification to each driver profile. Disable Autopilot for any driver who does not complete the course within a month.
Autopilot is a very new thing, and you are going to have to force people to take some time to understand what it can do, and what it cannot do.

Read this article or at least the promoted comment at the end.
Tesla has a self-driving strategy other companies abandoned years ago
 
One problem is the name. Autopilot is a terrible name for a driver-assist suite. Most drivers out there, Tesla owners included, are not all over the forums reading and discussing all of the technical capabilities and limitations of the tech in their car. The company says this is Autopilot, and it drives the car on the highway for you. It's pretty easy to get very comfortable with it quickly and forget it is not FSD.

The other problem is the driver. He was using AP on a road it is not intended for (one with cross traffic), which Tesla for some reason does not prohibit. He then used AP in a way it is not intended to be used, by taking his attention away from the road.

Everyone on this forum knows that is a recipe for disaster.

Did he know the limitations of the system and simply disregard them? Bad on him. Did he not know the limitations of the system because Tesla does not provide any required education (to the OP's point) and then lets you freely use it everywhere it is not intended to be used? Bad on Tesla.
 
...Tesla does not provide any required education...

Elon Musk Says Tesla Accidents May Result from 'Experienced User' Error | Inverse

“One of the common misimpressions is that when there is, say, a serious accident on Autopilot, people for some reason think that -- or some of the articles think that it’s because the driver thought the car was fully autonomous and it wasn’t, and we somehow misled them into thinking it was fully autonomous,” Musk told investors. “It is the opposite case. When there is a serious accident, almost always -- in fact, maybe always -- the case that it is an experienced user and the issue is more one of complacency. Like, they get too used to it.”
 
I would imagine any lawsuits against Tesla for Autopilot accidents would be a product liability claim, which typically comes in 3 flavours...

1. Manufacturing Defect

2. Design Defect

3. Failure to Warn

I don't really see any argument with respect to #1. Autopilot works as intended, so I would doubt any reputable lawyer would sue on that basis.

#2 and #3 would be the areas to contemplate.

For #2, you'd have to show that even though the system works as intended, it is de facto dangerous and there is a reasonable alternative design. For this, you'd typically have to have a qualified expert present a reasonable alternative. They might instead argue that there is no safe alternative, that any level of automation is unsafe, but I really don't see that argument getting much traction.

#3 seems to have the most teeth and this largely becomes a question of fact for a juror. Are the warnings associated with Autopilot and the Nag reasonable?

Lastly, anyone is pretty much free to sue anyone else, but again, I think that is the wrong way to look at it. The question is whether or not they can successfully sue someone/something. So, in the case of the death with the semi, first you'd look at the driver of the semi, next you look at the driver of the Tesla (for contributory negligence) and finally you look at the respective car manufacturers. (That's a bit simplistic - especially in States with joint and several liability, but you get the idea).

I was having this debate with an NYPD cop just this weekend who was telling me how unsafe my Tesla is on Autopilot and how it should be illegal. My response was simple: Autopilot should not be measured against perfection; the measure should be "is it safer than the average driver?" So, for example, if Autopilot's safety record is X accidents per 1 million miles and the human average is greater than X per 1 million miles, then Autopilot is worth it; if not, it isn't.
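For what it's worth, here is the back-of-the-envelope version of that "safer than the average driver" test, with made-up figures standing in for the real statistics (which none of us have):

def safer_than_average(ap_accidents, ap_miles, human_accidents, human_miles):
    """Compare accidents per million miles for Autopilot vs. human driving."""
    ap_rate = ap_accidents / (ap_miles / 1_000_000)
    human_rate = human_accidents / (human_miles / 1_000_000)
    return ap_rate < human_rate, ap_rate, human_rate

# Hypothetical figures only: 250 accidents over 1B AP miles vs. 500 over 1B human miles
ap_safer, ap_rate, human_rate = safer_than_average(250, 1_000_000_000, 500, 1_000_000_000)
print(f"AP: {ap_rate:.2f}/M mi, human: {human_rate:.2f}/M mi, AP safer: {ap_safer}")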
 
As I understand what the authorities have preliminarily determined (the final report may not be released for a year still), the driver of the semi pulled out and then slowed down, blocking the roadway to oncoming traffic. It didn't sound to me like the Tesla or any other car could have driven around the vehicle. Maybe the driver could have slowed down (I'm not sure about his speed and distance, so the result may have been the same in any event) or run off the road trying to avoid it; but until the report gets released, I'm still of the opinion that this accident was the result of the truck driver pulling out without yielding to oncoming traffic, which should have been visible through his driver's side window if he had looked to his left, especially with the Model 3's lights on.

As to how much responsibility the driver of the Tesla bore, I don't think we can say yet, if we ever can. We won't know why he apparently took no action; he could have been sneezing, dropped some coffee and looked down, dozed off, we don't know. It is curious why he had only just turned on AP at that point in his trip; maybe he wanted to make a phone call or check email. This was a section of highway with cross traffic, and since it was a route he supposedly took to work every day, he knew that. It was not an AP-approved section of highway. The NTSB also reported he was driving 68 mph in a 55 mph zone, and he would have set that speed. Perhaps driving slower would have bought him time to react if he did see the trailer in front of him at some point.
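Some rough numbers on that last point. Using a textbook 1.5-second perception-reaction time and ordinary dry-pavement braking (my assumptions, not accident-reconstruction figures), the difference between 55 and 68 mph works out to roughly 110 feet of extra stopping distance:

REACTION_TIME_S = 1.5    # assumed perception-reaction time
DECEL_FT_S2 = 21.0       # assumed ~0.65 g braking on dry pavement, in ft/s^2

def stopping_distance_ft(speed_mph):
    """Reaction distance plus braking distance at a constant deceleration."""
    v = speed_mph * 5280 / 3600              # mph -> ft/s
    reaction = v * REACTION_TIME_S           # distance covered before the brakes are even applied
    braking = v ** 2 / (2 * DECEL_FT_S2)     # distance to stop once braking starts
    return reaction + braking

for mph in (55, 68):
    print(f"{mph} mph: ~{stopping_distance_ft(mph):.0f} ft to stop")
# ~276 ft at 55 mph vs. ~386 ft at 68 mph, about 40% farther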

But just like in the Mt. View case, how does a driver let the car, any car for that matter, drive them into a solid object when they had time to react? Only two answers I can think of: not paying attention, or suicide. I can see the family suing the truck driver, and the trucking company of course gets tossed in too because of the bigger pockets and insurance coverage they would have, but I still don't see Tesla as the one at fault here. I'm kind of surprised the family has filed suit before the final report comes out, but maybe they are hoping it will get heard before then and that a jury will be sympathetic to them.

The NTSB's preliminary report, 2 pages, can be read in this Forbes article: Investigators Say Tesla Model 3 Driver Killed In Florida Crash Was Using Autopilot

Watch the video of the lawsuit news story from CBS12News on 8/1. Here's a look at the diagram of exactly where the Tesla was in relation to the truck's trailer at the 1:28 mark. There is pretty extensive footage of the car's condition right after that time marker.

accident story - 1.jpg
 
...AP would have been a better name for FSD. It's a marketing term implying more than is really available.
Except that autopilot (without the capital A, as used on aircraft the world over, including large passenger jets) behaves in an almost identical way to Tesla's Autopilot in its ability to keep an aeroplane flying safely and headed in the right direction. It is the fact that laypeople do not know the limitations of autopilot that causes the problem. However, it was, indeed, a poorly researched name for the product in my view. (UK-based, AP-equipped Model S owner of 4 years, Tesla investor since 2011, and qualified pilot!)
 
...Won't know the reason why he apparently took no action...

The driver did take some actions immediately prior to the collision:

10 seconds before the collision: Autopilot was turned on
2 seconds later, or 8 seconds before the collision: the steering wheel torque sensor did not detect any torque from the driver's hands.

If it were me, I would rather be ready to apply the brakes than watch to see whether Autopilot would brake for me!
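To put that timeline in perspective, here is the simple arithmetic for how much road goes by at the reported 68 mph during the ten seconds between engaging Autopilot and the collision, and during the eight seconds with no detected hands on the wheel (the speed comes from the NTSB figure quoted earlier; the rest is just unit conversion):

def distance_ft(speed_mph, seconds):
    """Distance covered at a constant speed, in feet."""
    return speed_mph * 5280 / 3600 * seconds

print(f"10 s at 68 mph: ~{distance_ft(68, 10):.0f} ft")  # roughly 1,000 ft
print(f" 8 s at 68 mph: ~{distance_ft(68, 8):.0f} ft")   # roughly 800 ft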