
First pedestrian killed by Autopilot, family sues

It could also be argued that any significant change of speed or a change of operating mode (follow car in front to follow lane) should require acknowledgement by the driver. They require acknowledgement for lane changes... Except in "Mad Max" mode, because that's the kind of responsible safety minded engineering Tesla does.

FYI: You can turn off acknowledgements in any mode, not just Mad Max.
 
If anything, drivers should be paying even more attention while on AP rather than relaxing.

Ideally, yes. But the problem is that advanced L2+ systems like AP actually do the opposite. They encourage the driver to pay less attention, not more. The reason they encourage drivers to pay less attention is because they take over so many of the driving tasks that they can create the illusion of "self-driving" which can lull the driver into a false sense of security.

At the point where the car can do lane keeping and cruise control and the driver does not think they need to hold the wheel or touch the pedals, it can be easy to think that the car is driving, when it really isn't. For example, in some highway driving cases, AP can seemingly handle highway driving on its own for hours with little to no issues. In fact, I routinely see Tesla owners say things like "My car is self-driving. AP drove me 200 miles and I did not have to do anything." In those conditions, it's easy to think that you don't need to pay as much attention.

Of course, this is a false sense of security because while the car is controlling steering and braking on its own so the driver does not need to do those tasks, the car is not actually paying attention to its environment like it should. So we get cases like this accident where a car in front cuts-out and TACC just does what it is programmed to do, accelerate back to its set speed, without paying attention to pedestrians in the way.
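To make the cut-out failure mode concrete, here is a hypothetical sketch of naive adaptive-cruise logic (purely illustrative, not Tesla's actual code): when the lead car disappears from the lane, the controller simply resumes the driver's set speed, with no independent check that the newly revealed path is clear.

```python
def tacc_target_speed(set_speed_kmh, lead_car_speed_kmh):
    """Return the speed a naive cruise controller will accelerate toward.

    lead_car_speed_kmh is None when no lead vehicle is tracked
    (e.g. the car ahead just cut out of the lane).
    """
    if lead_car_speed_kmh is None:
        # Lead car cut out: resume the driver's set speed immediately,
        # without verifying that the road ahead is actually clear.
        return set_speed_kmh
    # Otherwise, follow the (possibly slower) lead car.
    return min(set_speed_kmh, lead_car_speed_kmh)

# Following a slow lead car: target is the lead car's speed.
print(tacc_target_speed(120, 19))    # 19
# Lead car cuts out: target jumps straight back to 120 km/h.
print(tacc_target_speed(120, None))  # 120
```

The gap is in that `None` branch: a human who sees the lead car swerve away asks "why did it swerve?" before accelerating, while this logic treats an empty track list as an all-clear.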
 
So.. I am not going to click the link because I feel these titles are clickbait... but someone explain to me why TESLA is getting sued, and not the "dozing driver" who is actually at fault, besides deeper pockets?
Agreed, I didn't click on it either.
Here is more information on it.

It also makes it clear that AP was engaged.
https://www.courthousenews.com/wp-content/uploads/2020/04/Tesla-Death.pdf

Thanks for sharing the Courthouse News link. There's a lot of information here. Sorry for the family's loss. We're in the second season of Bull and I'm thinking Tesla may need to hire the Doctor.
Oh forgot to mention, I love all these complaints that people have about the autopilot system and all that it can't do, but they'll go right ahead and use it.
 
So.. I am not going to click the link because I feel these titles are clickbait... but someone explain to me why TESLA is getting sued, and not the "dozing driver" who is actually at fault, besides deeper pockets?

As the accident happened in Japan the estate of the killed pedestrian will only be able to get compensation from the driver and his insurance under Japanese law. I guess that the sums you can obtain under Japanese law are significantly less than what you can get in the US. Yet it's only Tesla that may potentially have any liability under US jurisdiction. Therefore it makes perfect sense from the plaintiffs' point of view to go after Tesla.

I'm not sure if their case is entirely without merit. I haven't found the time yet to read their submission but it could be argued that the death was caused actively by the autopilot system. On his own the dozing driver would have hit the accident site at a rather sedate 19 km/h. It was the autopilot that accelerated the car, therefore it might be argued that it should have had better control features to check if it was safe to do so.
 
I'm not sure if their case is entirely without merit. I haven't found the time yet to read their submission but it could be argued that the death was caused actively by the autopilot system. On his own the dozing driver would have hit the accident site at a rather sedate 19 km/h. It was the autopilot that accelerated the car, therefore it might be argued that it should have had better control features to check if it was safe to do so.

I think there are some problems with the case.

The first problem is it tries to argue that the driver monitoring was faulty because the steering wheel torque sensor isn't very good at this. That's an okay argument to make for a 2020 vehicle, but not for a 2016 model vehicle. There is also the issue that even the latest/greatest driver monitoring systems don't always detect the second someone is falling asleep or briefly not paying attention. Having one isn't a guarantee that it would have prevented the accident.
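The torque-sensor weakness described above can be sketched as follows. This is a hypothetical illustration (the interval and threshold are assumptions, not Tesla's values): attention is inferred purely from steering torque, so a drowsy driver resting a hand on the wheel still passes the check.

```python
NAG_INTERVAL_S = 30.0      # assumed warning interval, for illustration
TORQUE_THRESHOLD_NM = 0.5  # assumed minimum torque counted as "hands on"

def should_warn(seconds_since_torque, torque_nm):
    """Warn only when no qualifying torque has been felt for a full interval."""
    if abs(torque_nm) >= TORQUE_THRESHOLD_NM:
        # Any slight tug resets the timer, whether the driver is alert or not.
        return False
    return seconds_since_torque >= NAG_INTERVAL_S

print(should_warn(45.0, 0.0))  # True: no torque for 45 s, so nag the driver
print(should_warn(45.0, 0.6))  # False: a resting hand's torque suppresses it
```

The point is that the signal being measured (wheel torque) is only a loose proxy for the thing that matters (attention), which is why even camera-based monitors can miss the moment someone nods off.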

The second issue is it tries to argue that cut-ins/outs are common, and while that's true, it's not true that most L2 systems handle them well. The radar systems are often programmed to ignore stopped objects. Wired has a really good article on it. This means that sometimes these systems are blind to stopped vehicles.
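Why would a radar system ignore stopped objects? A hypothetical sketch (names and thresholds are illustrative, not from any real system): radar measures relative speed, and returns whose absolute speed is roughly zero are overwhelmingly roadside clutter (signs, bridges, barriers), so trackers commonly discard them.

```python
EGO_SPEED_MPS = 30.0  # our car's own speed over the ground

def keep_target(radar_relative_speed_mps):
    """Keep only radar returns that appear to be moving relative to the ground.

    radar_relative_speed_mps is target speed minus our speed, so a
    stationary object shows up as -EGO_SPEED_MPS.
    """
    absolute_speed = EGO_SPEED_MPS + radar_relative_speed_mps
    # Near-zero absolute speed is indistinguishable from a sign or bridge,
    # so the tracker drops it rather than brake for every overpass.
    return abs(absolute_speed) > 1.0

print(keep_target(-10.0))  # True: a car doing 20 m/s ahead is tracked
print(keep_target(-30.0))  # False: a stopped car looks like clutter, dropped
```

That filtering is the design trade-off behind the "blind to stopped vehicles" behavior: without it, the car would phantom-brake constantly; with it, a stalled car in the lane is filtered out along with the clutter.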

The third issue is that it goes on and on about AP being beta technology. That's definitely true of AP2, but this was an AP1 vehicle, as it was a 2016 model. AP1 was built using Mobileye technology which was really proven out, especially things like TACC, which wasn't even really beta at the time.

It also mentions other accidents, like the fatal Model X accident in California. But we now know the driver was playing a video game on his phone during at least part of the drive.

I think it's okay to blame AP when it directly contributes to an accident by suddenly making some movement so immediate and abrupt that an average human driver couldn't react in time. But I'm not okay with the induced-inattention argument when it comes to liability in an accident.

To me this feels like it shifts the blame from the driver.

When I had a 2015 Model S with AP1 I really did feel like AP was causing me to lose situational awareness. I wasn't getting drowsy, but I noticed a lagged response to things like debris in the road. I also noticed that I wasn't as clued into vehicles around me. I largely stopped using AP because of this, and truck lust (AP1 liked to snuggle with semi-trailers). Having a better driver monitoring system wouldn't have changed anything since they can't measure situational awareness. It was 100% my responsibility not to allow AP to erode my driving, and the only way I felt like I could do that was to stop using it in most situations.

With my 2018 Model 3 it's almost the complete opposite, where my driving is sharpened by NoA/AP because I'm always trying to prevent embarrassment. Every trip it's like "nope, you're not going to false brake here" or "nope, not going to lane change here."
 
Ideally, yes. But the problem is that advanced L2+ systems like AP actually do the opposite. They encourage the driver to pay less attention, not more. The reason they encourage drivers to pay less attention is because they take over so many of the driving tasks that they can create the illusion of "self-driving" which can lull the driver into a false sense of security. ....
Link supporting your assertion?
That's not my experience with over 5000 AP miles. I make a 350 mi trip 2-4x per month.
Drives with AP leave me far less fatigued because I'm only supervising the driving rather than constantly making micro adjustments to speed and steering. Using AP in this fashion is simply a safer, better experience.
 
If this is the case I don't think any of the blame will/should be shifted towards Tesla.

Even after years of them being around, some (a lot of) people still do not understand the limitations of autopilot.

You don't think releasing a feature called "autopilot" and then failing to inform many of your customers about what it does and how it works for years on end, despite multiple fatal accidents, confers any blame on Tesla?
 
Link supporting your assertion?
That's not my experience with over 5000 AP miles. I make a 350 mi trip 2-4x per month.
Drives with AP leave me far less fatigued because I'm only supervising the driving rather than constantly making micro adjustments to speed and steering. Using AP in this fashion is simply a safer, better experience.

I have the same experience as you. I am less fatigued using AP. But that's not the point. I am not talking about being fatigued or not. I am talking about the level of attentiveness. So yes, you only supervise the driving and don't need to control steering, but do you still supervise the driving WITH THE SAME LEVEL of attentiveness after 400 miles as you do after 10 miles? When AP has been perfect for 300 miles, do you start to trust it more and let down your guard a bit, or do you still supervise it with the same eagle-eye attention?

That's my point. Drivers will pay less attention, not because they are tired and can't pay attention, but because they will trust the system more and will start assuming that the system can handle things and they don't need to watch it as closely. Also, it is easier not to pay attention when all you have to do is watch the road than when you also need to control steering and braking. And this is a problem because AP might be perfect for 200 miles until it encounters that edge case it can't handle, and then an accident happens if the driver was not paying attention in that moment.

Disagree. AP is not expected to "handle" the unexpected. The driver is.

True but that assumes that the driver is in a position to be able to handle it.
 
I have the same experience as you. I am less fatigued using AP. But that's not the point. I am not talking about being fatigued or not. I am talking about the level of attentiveness. So yes, you only supervise the driving and don't need to control steering, but do you still supervise the driving WITH THE SAME LEVEL of attentiveness after 400 miles as you do after 10 miles? When AP has been perfect for 300 miles, do you start to trust it more and let down your guard a bit, or do you still supervise it with the same eagle-eye attention?

That's my point. Drivers will pay less attention, not because they are tired and can't pay attention, but because they will trust the system more and will start assuming that the system can handle things and they don't need to watch it as closely. Also, it is easier not to pay attention when all you have to do is watch the road than when you also need to control steering and braking. And this is a problem because AP might be perfect for 200 miles until it encounters that edge case it can't handle, and then an accident happens if the driver was not paying attention in that moment.



True but that assumes that the driver is in a position to be able to handle it.
It is my personal experience that a car moving at 70MPH is more dangerous with a fatigued / distracted driver, absent AP, than with AP. On a 100+ mi trip, driving with AP is safer IMO, and I feel safer using it in this manner.
 
You don't think releasing a feature called "autopilot" and then failing to inform many of your customers about what it does and how it works for years on end, despite multiple fatal accidents, confers any blame on Tesla?

The manual explains AP pretty well. You also need to OK an agreement on the car's monitor in order to use it.
 
You don't think releasing a feature called "autopilot" and then failing to inform many of your customers about what it does and how it works for years on end, despite multiple fatal accidents, confers any blame on Tesla?


So they explain it very well in the manual, mention its limits when you first enable it in the car, and remind you again every time you turn it on. It's also in the feature description on the website... so... no.


Also, AFAIK there's been exactly one fatal accident, ever, confirmed from someone using AP on a road it's intended to be used on (the Model X in CA on the off-ramp where CA DOT hadn't repaired the attenuation barrier in time).

And the manual is pretty clear on which roads those are.


That's an incredibly low death rate given the number of miles driven on AP. And if the barrier had been fixed, the death total almost certainly goes from 1 to 0.
 
Unfortunately for Tesla, the law doesn't care if it's an "incredibly low death rate"; all it cares about is whether Tesla is responsible or not.

Given that the NTSB just released a report saying that autopilot is inadequate and needs more regulation it seems like they have a case.
 
Unfortunately for Tesla, the law doesn't care if it's an "incredibly low death rate"; all it cares about is whether Tesla is responsible or not.

Except, the law DOES care.

If something kills people LESS often than NOT having it, that's the OPPOSITE of a liability.

But apart from that, the two non-highway deaths were investigated, and in both cases Tesla was found to not be responsible.

The idiots using the system someplace it's explicitly, in writing, not supposed to be used were responsible.

Even in the one HIGHWAY case, they found that he was playing a mobile game prior to the crash, and said that was "likely" the reason why he didn't try to turn away from the barrier. They bitched that the NHTSA (which, unlike the NTSB, can set rules) wasn't doing a good enough job setting rules to require better driver-engagement checking, but Tesla was entirely compliant with EXISTING rules with their current system.



Given that the NTSB just released a report saying that autopilot is inadequate and needs more regulation it seems like they have a case.


It really doesn't.

Because the NTSB doesn't actually set laws or policy.

Basically they said it doesn't, in their opinion, do enough to tell if the driver is distracted, but that's the fault of the NHTSA, which, not the NTSB, is the org that can change that... and until the RULES change there's no requirement for Tesla to do so.

"Did everything legally required- but that wasn't enough to stop an idiot from killing himself by playing video games while driving" is a pretty tough case to win in court I'm afraid.
 
When you armchair lawyers get your law degrees, maybe your opinions will be anything but ignorant. @diplomat33 is the only poster who is not a lazy complainer about OUR legal system.

This is a product liability issue. AP didn't work as designed, and it failed to adhere to its own limited warnings on operation. The wheel torque sensor is defective for its purpose. It's clearly badly engineered with regard to driver monitoring. Tesla is going to lose, and should. They profit from a poorly designed product and need to do better. The innocent pedestrian's family is being called sue-happy by sociopaths who blame a dead person's family for trying to get justice. Sick. Tesla should be held to account for its failures, as should the driver. Both are at fault.

Right. Using your bizarre logic, no feature should be released until it is proven 100% reliable, safe, and impervious to human mistakes. No such system exists or will ever exist. Let's not forget the fact that the driver is ultimately responsible for safety at this time.
 
Right. Using your bizarre logic, no feature should be released until it is proven 100% reliable, safe, and impervious to human mistakes. No such system exists or will ever exist. Let's not forget the fact that the driver is ultimately responsible for safety at this time.

No, but if it's defective, you fail to warn of risks, or you commit other torts, then that's what the law says. Don't have a tantrum about being held to account if you put products into commerce. This isn't the 1800s, with real cocaine in Coca-Cola. We are civilized here. If you don't want the laws we've passed, move to Afghanistan.