Welcome to Tesla Motors Club

First pedestrian killed by Autopilot, family sues

Right. By your bizarre logic, no feature should be released until it is proven 100% reliable, safe, and impervious to human mistakes. No such system exists or ever will. And let's not forget that the driver is ultimately responsible for safety at this time.

Except nobody is claiming that a system needs to be 100% perfect 100% of the time. Obviously that is unrealistic. The issue is whether a fault was responsible for the accident or not. That's why accidents are investigated: to figure out exactly what happened and whether a defect or flaw was responsible.

But frankly, I find this whole argument that the driver is responsible, therefore we should not question the technology, to be bizarre. Yes, the driver is responsible, since AP is L2. And yes, no system is 100%. But we should still try to prevent future deaths where possible. Just because no system is perfect does not mean we should do nothing and let more deaths happen that maybe could have been prevented. A better driver monitoring system would help make AP even better and even safer, so that should be explored as well.
 
  • Like
Reactions: GZDongles
No, but if it's defective, if you fail to warn of risks, or if you commit other torts, then that's what the law says. Don't throw a tantrum about being held to account if you put products into commerce. This isn't the 1800s, with real cocaine in Coke. We are civilized here. If you don't want the laws we've passed, move to Afghanistan.
"Fail to warn of the risks"? The manual is pretty explicit about what it can and can't do. It also makes it clear that the system requires driver supervision.
 
Here's the question: yes, the driver is responsible but should Tesla do more to help make sure that the driver is paying attention in order to try to prevent these crashes or is it enough that Tesla warns the driver that they are responsible?
 
> Except nobody is claiming that a system needs to be 100% perfect 100% of the time. Obviously that is unrealistic. The issue is whether the fault was responsible for the accident or not. That's why accidents are investigated, to figure out exactly what happened and whether a defect or flaw was responsible for the accident or not.
>
> But frankly, I find this whole argument that the driver is responsible therefore we should not question the technology to be bizarre. Yes, the driver is responsible since AP is L2. And yes, no system is 100%. But we should still try to prevent future deaths if possible. Just because no system is perfect, does not mean we should just do nothing and let more deaths happen that maybe could be prevented. And a better driver monitoring system would help make AP even better and even safer. So that should be explored as well.
According to the statistics Tesla released for 1Q 2020 (assuming they are accurate), there are far fewer accidents with Autopilot enabled than without. Nobody is advocating that nothing should be done to improve the system; Tesla should be constantly taking situations like these and using the data to make improvements.

The issue here is suing Tesla for an accident where the driver wasn't paying attention and the system was working as designed at the time: it used radar as the primary sensor and discarded the signature of the stationary object (based on my understanding of how early TACC worked). The manual clearly states that the system is not 100% and the driver is responsible.

In many of your posts I get the impression Tesla shouldn't release features until it can guarantee they will react properly in 100% of situations. I view this as unrealistic and an impediment to technological advancement. I accept the terms stated when I activate the system, and am 100% responsible for preventing an accident.
 
Simplistic analysis is never going to be fruitful. Anyone who takes Tesla's reported data without a million caveats about where AP is used, and about how those miles aren't comparable at all to general miles driven, is being specious; under actual legal scrutiny Tesla won't come out smelling like roses. They may have defenses, but most jurisdictions apply comparative fault (assigning liability based on percentage of fault). No matter how you slice this, Tesla deserves some blame for putting out a system that can be easily abused (another product liability ground). Foreseeable misuse is also a basis for liability.
 
  • Disagree
Reactions: VT_EE
If the court sides with the plaintiff, this will be a very bad decision in the long run.
The requirement for 100% perfection out of advanced safety systems will keep them off the market.
Or at least delay them until a more intelligent judge hears the case.
 
  • Like
Reactions: mikes_fsd
> If the court sides with the plaintiff, this will be a very bad decision in the long run.
> The requirement for 100% perfection out of advanced safety systems will keep them off the market.
> Or at least delay them until a more intelligent judge hears the case.

Again, nobody is asking for a 100% perfect driver assist system. Everybody recognizes that driver assists are not designed to handle every situation. What some are asking is that Tesla implement a good driver monitoring system, because the current steering-wheel torque system sucks.

With a proper driver monitoring system, you ensure the driver is actually paying attention and able to intervene when the driver assist can't handle something.

Without one, you pair a driver assist that needs an attentive driver with an unreliable human who may or may not be paying attention.

It should be obvious that this is not a safe combination.
 
  • Disagree
Reactions: SO16 and mikes_fsd
But are they? The NTSB and others have repeatedly told Tesla to fix the driver monitoring system and Tesla has ignored every request.

Improving driver monitoring would dramatically decrease this sort of accident, but my opinion is that it can't be done with steering wheel sensors alone, no matter how they tune them or how often the system nags.

Tesla really should get over the privacy nonsense and work on a proper interior driver monitoring system with a good eye tracker. It really isn't that difficult; the gaming world already has this, where viewers can see in real time exactly where on the screen the player is looking.

Other manufacturers do this already.
 
> But are they? The NTSB and others have repeatedly told Tesla to fix the driver monitoring system and Tesla has ignored every request.


The NTSB doesn't get to make rules, though.

NHTSA does, and it continues to NOT require Tesla to change its monitoring system.

Several times now the NTSB has thrown a fit about the fact that NHTSA keeps ignoring them on this.
 
  • Like
Reactions: diplomat33
> The NTSB doesn't get to make rules though.
>
> The NHTSA does- and continues to NOT require Tesla to change their monitoring system.
>
> Several times now the NTSB threw a hissy about the fact the NHTSA keeps ignoring them on this.

True. The NTSB can only make recommendations. Hopefully, regulators will require a driver-facing camera at some point.
 
  • Disagree
  • Like
Reactions: SO16 and cucubits
> If the court sides with the plaintiff, this will be a very bad decision in the long run.
> The requirement for 100% perfection out of advanced safety systems will keep them off the market.
> Or at least delay them until a more intelligent judge hears the case.
How is having the car accelerate on its own a safety system?
Do you think ruling for the plaintiff would cause manufacturers to remove systems that brake for pedestrians? That wouldn't even be possible since they're required by law in 2022.
 
> How is having the car accelerate on its own a safety system?
> Do you think ruling for the plaintiff would cause manufacturers to remove systems that brake for pedestrians? That wouldn't even be possible since they're required by law in 2022.

All Adaptive Cruise Controls on the market "accelerate on their own".

Automatic Emergency Braking and Pedestrian Alert Systems are optional in most countries. Why take the litigation risk? No system built by humans is 100.000% accurate, so it's not a question of whether you get sued; it's a question of when and how much.

The Ford Pinto did not have more fire deaths than its contemporaries, but that was never the issue. Ford lost big even though everyone had the same problem; their mistake was acknowledging a risk level. IIRC, VW had over 4x the fire-death rate of the Pinto.
 
> All Adaptive Cruise Controls on the market "accelerate on their own".
That doesn't mean they're a safety feature. I suspect a system that only brakes on its own would be safer than one that both brakes and accelerates. I'm not saying they shouldn't be legal, but I view it as a convenience system more than a safety system.
> Automatic Emergency Braking and Pedestrian Alert Systems are optional in most countries. Why take the litigation risk? No system built by humans is 100.000% accurate. So it's not a question of whether you get sued, it's a question of when and how much.
They're going to be required pretty much everywhere soon. I think there's a big difference between this case and your hypothetical case. I'm sure there have been many cases of pedestrians being hit by cars with AEB already.
 
> That doesn't mean they're a safety feature. I suspect a system that only brakes on its own would be safer than a system that both brakes and accelerates. I'm not saying that they shouldn't be legal but I view it as a convenience system more than a safety system.
>
> They're going to be required pretty much everywhere soon. I think there's a big difference between this case and your hypothetical case. I'm sure there have been many cases of pedestrians being hit by cars with AEB already.

How many AEB cases were settled? Oh wait, that sticky NDA stuff. Oops.

Since ACC will slow a car down rather briskly when traffic slows, it is a safety system, and like ABS, stability control, and AEB it is grouped into the entire braking/chassis control system.

Major automakers have demonstrated their AV system developments to the press since the 1960s, and in earnest starting in 2010.

I track-tested the first mass-produced stability control system. It had some bugs.
If a class action had kicked in, stability control as we know it could have been delayed, at the cost of thousands of lives.
 
  • Like
Reactions: Earthpower
The driver is responsible until cars can drive themselves. Autopilot requires driver attention, same as brakes and cruise control. I think those suing are looking for extra $. If drivers can blame their cars for accidents, no one will be responsible. Will people then sue Ford, Toyota, Honda, Tesla, etc.?
 
If this is the case I don't think any of the blame will/should be shifted towards Tesla.

Even after years of them being around, some (a lot of) people still do not understand the limitations of autopilot.

When I keep saying that it will happily run into stationary objects (same as any other car with "smart" cruise control), I'm told I'm crazy. The irony is that both sides usually disagree with that: fanboys say it won't, and that AP is the smartest invention known to man, while haters say that only Tesla does this, that it's broken and not working as intended.

For now this is not a fault; it's a known limitation, and it's how radar cruise control works: stationary objects are filtered out. Sure, AP's cameras may or may not see something and brake, but no one should rely on that. If anything, drivers should be paying even more attention while on AP rather than relaxing.
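To make the "stationary objects are filtered out" point concrete, here is a toy sketch of the classic heuristic. This is not Tesla's code; the names, numbers, and threshold are all made up for illustration. The idea is that radar reports each target's speed relative to the car, and anything whose computed ground speed is near zero is indistinguishable from roadside clutter (signs, bridges, parked cars), so early radar-based TACC simply dropped it.

```python
# Toy illustration (NOT Tesla's actual implementation) of why classic
# radar-based adaptive cruise control ignores stationary objects.
# All constants here are hypothetical.

EGO_SPEED_MPS = 30.0          # our car's speed, meters/second
STATIONARY_TOLERANCE = 2.0    # ground speeds below this look like clutter

def ground_speed(relative_speed_mps: float) -> float:
    """Convert a radar target's relative (Doppler) speed to ground speed."""
    return EGO_SPEED_MPS + relative_speed_mps

def track_worthy(relative_speed_mps: float) -> bool:
    """Classic TACC heuristic: only track targets that are actually moving."""
    return abs(ground_speed(relative_speed_mps)) > STATIONARY_TOLERANCE

# A car ahead doing 25 m/s closes on us at -5 m/s: tracked.
print(track_worthy(-5.0))    # True
# A stopped obstacle closes at exactly -30 m/s: its ground speed is 0,
# so it is filtered out along with every sign and overpass.
print(track_worthy(-30.0))   # False
```

The uncomfortable consequence is exactly what posters above describe: a stopped vehicle or pedestrian in the lane falls into the same bucket as harmless roadside returns, which is why the driver has to stay alert.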

IMHO, AP is a dangerous toy. As this poster said, you should be WAY MORE ALERT when using it than when not. It's kinda like putting a five-year-old in your lap to drive in traffic, but a five-year-old with teenage stubbornness traits. I use it from time to time (of course I think it's cool and I'm curious), but only while on high alert and expecting something strange to happen at any moment.

I think if you were to fully relax with it, you'd be putting yourself and others at risk. I don't have trouble with putting myself at risk; it's the "others" I'm concerned about!
 
  • Like
Reactions: cucubits
AP should have been called Co-Pilot. It is best used as a safety device, not a multitasking app for cars.

No, because if/when the vehicle is fully autonomous, they'd need to change the name. Besides, boats and planes have used that same name for years and it hasn't needed to change. This whole "if the name was different, this wouldn't happen" nonsense has really got to end.

If something as simple as a name confuses people even though they are warned to pay attention and keep their hands on the wheel EVERY TIME the system is turned on, then there is no hope for those people. That is just a cop-out.
 
  • Like
  • Disagree
Reactions: VT_EE and MXLRplus