May 19, 2017
1,311
564
The family of a man who died last year after his Tesla Model X crashed on a California highway while operating on Autopilot is suing Tesla. The suit alleges wrongful death and negligence stemming from failures and false promises regarding the Autopilot driver-assistance system. The incident took place on March 23 on a busy stretch...
[WPURI="https://teslamotorsclub.com/blog/2019/05/01/tesla-sued-for-deadly-crash-on-autopilot/"]READ FULL ARTICLE[/WPURI]
 

Cloxxki

Active Member
Aug 20, 2016
1,362
706
Rotterdam
From what I know, the driver had himself to blame for not paying attention. Tesla knew full well this was a risk, as it was already causing accidents, even deadly ones.
In my opinion, Tesla has been actively doing as little as possible to get drivers to pay attention, only adjusting its approach and policies in response to public and press outrage, not of its own accord, and openly resisting tech that monitors the driver's eyes.
The whole concept of "machine learning" is a bit of an Apollo landing story to me. The crash barrier had been missing for a while, and that exit had been taken numerous times. Somehow the car not only managed to leave its lane but also failed to detect the immovable object. And I don't see Tesla learning, even through programming, from such errors and literal death traps. With years of opportunity for machine learning and programming in between, Autopilot recently failed again to spot a semi crossing the highway, again resulting in a death.
It doesn't matter whether Tesla has a less flawed driver-assistance system than others if it still allows conditions to develop where lives are endangered.

I want Tesla to succeed in making cars safer, but even the AP incident data it releases is an Elonism in itself. It takes the miles driven on AP and divides them by accidents, then does the same for miles where AP was not active, neglecting the limited overlap between the two in roads and conditions, and the game-changer of AP disabling itself when things get too difficult, leaving the driver to deal with exactly the miles that are most likely to produce accidents.
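A toy calculation (every number below is made up for illustration, nothing here is real Tesla data) shows how that selection effect can flatter AP even when it is, by construction, no safer on any given road:

[CODE=python]
# Invented numbers only -- illustrating the selection effect, not real data.
easy_rate = 1 / 1_000_000   # assumed accidents per easy highway mile
hard_rate = 5 / 1_000_000   # assumed accidents per hard mile (weather, junctions)

ap_easy, ap_hard = 900_000, 100_000   # AP engaged mostly on easy miles
hu_easy, hu_hard = 100_000, 900_000   # the driver is handed the hard miles

def per_million(easy_miles, hard_miles):
    """Blended accident rate per million miles for a given mix of miles."""
    accidents = easy_miles * easy_rate + hard_miles * hard_rate
    return accidents / (easy_miles + hard_miles) * 1_000_000

print(f"AP:     {per_million(ap_easy, ap_hard):.2f} accidents per million miles")   # 1.40
print(f"non-AP: {per_million(hu_easy, hu_hard):.2f} accidents per million miles")   # 4.60
# AP looks over 3x safer, even though both are identical per road type here,
# purely because AP gets the easy miles and hands back the hard ones.
[/CODE]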

Due to the humongous blind support from (prospective) buyers, Tesla gets away with utter amateurism across its operations. The latest versions of EAP/FSD are developing phantom braking again. Is that added safety or a loss of quality?

I don't feel the family deserves much compensation from Tesla, but I do hope Tesla comes out the other end stepping it up, and I wish a court would hold it to that. The testing protocols are so bad my English vocabulary cannot begin to describe it. "But, but, over-the-air updates!" Those are a plaster, not a preventive measure.
I keep mentioning the Model 3 brakes (the lack of performance affected thousands of cars and was discovered by a consumer publication), but they really are a skyscraper-sized sign that Tesla doesn't test new versions and doesn't explore corner cases. It just lets the test fleet give the thumbs up rather than involving a professional tester to find corner cases. With so many cars on the road, even without AP, someone is going to find out the hard way that something was changed and not tested.

Let's hope something good comes from this; it should have happened before this person risked his own life and the lives of others sharing the road with him.

I will say that, with all the negative news being dragged out in the press, AP seems to have been quite efficient (or lucky) at keeping collateral damage to a minimum. If Teslas were killing anyone other than their occupants, this company would probably not be getting away with what it is now.
Due diligence with FSD Level 5 regulation will be interesting, to say the least. Actual scrutiny and independent testing seem like a no-brainer. Somehow, Tesla just pushes AP/FSD updates it has only given the once-over, inevitably (if it did give them the once-over) knowing about increased phantom braking, wrong highway exits causing driver stress, and so on.
 
  • Like
Reactions: Darren Donovan

Singuy

Active Member
Jun 28, 2018
4,568
36,206
US
Cloxxki said: In my opinion, Tesla has been actively doing as little as possible to get drivers to pay attention. …

So a nag every 30 seconds still counts as Tesla not doing enough?
 
  • Like
  • Disagree
Reactions: neroden and darxsys

docdeb27

M3 LR AWD Pearl White Prem Int.
Aug 16, 2018
246
144
SHOREHAM
Cloxxki said: From what I know, the driver had himself to blame for not paying attention. …
 

SO16

Active Member
Feb 25, 2016
3,208
10,379
MI
It’s time for people to step up and take responsibility for their own actions. When enabling autopilot, I am reminded to pay attention EVERY SINGLE TIME! This case better get thrown out.

If this is the same person who was complaining about the shortcomings of Autopilot, why wasn’t he paying attention? I’d think he’d be more apt to do so.
 

docdeb27

M3 LR AWD Pearl White Prem Int.
Aug 16, 2018
246
144
SHOREHAM
Tesla tells the driver at every start of EAP to always stay alert and in control. It beeps and stops if you remove your hands from the wheel. It reminds you every 30 seconds. What on earth more do you want Tesla to do? If you tell someone to be careful with a sharp knife, they aren't, and they cut themselves, is that the knife's fault? Really? Only licensed adults are allowed to use this. Responsibility lies with them!!
 

Cloxxki

Active Member
Aug 20, 2016
1,362
706
Rotterdam
So a nag every 30 seconds still counts as Tesla not doing enough?
You'll find that an accident takes less time than that to develop. Try it enough times with a faulty version of AP and you'll find out, or at least your loved ones will. Real-time monitoring is the way to do it.

Also, wasn't the nag much less frequent before? People were changing or sleeping on the highway. Why was the nag changed?
 

Fellsteruk

Active Member
Feb 24, 2018
1,040
437
North West, UK
Very, very sad for the family's loss, and it makes sense that they are looking for someone to blame, but all these claims of AP killing people, or trying to, are wearing thin.

We are, and should be, responsible for and in control of any vehicle, and the buck stops with us if we crash.

Guns clearly kill people, but someone has to pull the trigger; we don't blame the gun or its maker.
 

⚡️ELECTROMAN⚡️

SS of 96 and falling
Jul 15, 2016
2,869
5,425
Pacific Northwest
The guy basically committed suicide as far as I'm concerned. I've been banned for stating this in the past; ban me again if you want. He knew that AP struggled in that exact area on previous drives, but he let it crash this particular time. I'm not saying he definitely did it on purpose; that's just my opinion. But it's not as if suicides are so rare that this would be out of the question.
 

SO16

Active Member
Feb 25, 2016
3,208
10,379
MI
⚡️ELECTROMAN⚡️ said: The guy basically committed suicide as far as I'm concerned. …

He was vocal about AP being an issue, so why not pay extra attention?

Maybe... just maybe, he knew there had been an accident there before and that the driver survived. So to stress his point, perhaps he thought he'd let Autopilot crash there to prove it... not realizing the barrier hadn't been replaced. A long shot, and insensitive given the death, I know. But his actions (or lack thereof) still seem EXTREMELY fishy to me.
 

cbdream99

Member
Apr 8, 2017
130
94
DFW
I feel sorry for the young family. AP certainly has some issues in that location, and the broken attenuator only made it worse, but human responsibility cannot be excused in this tragic accident.

I have experienced many times that AP drifts toward a broken lane marker leading to the entrance of a reversible express lane if I don't pay attention, but every single time I can brake and steer the car to continue in my lane without any issue. I wouldn't try to find out whether emergency braking will stop it in time before crashing into the crossbars, so I pay extra attention whenever I use AP in that location, or stay in the middle lane to avoid the situation altogether.
 

hcdavis3

HCD3
Supporting Member
Mar 3, 2019
2,292
1,498
02571
SO16 said: It's time for people to step up and take responsibility for their own actions. When enabling autopilot, I am reminded to pay attention EVERY SINGLE TIME! …
Every single second. I'm not ready to throw caution to the wind yet, or ever. I just purchased FSD and I'm being very cautious to make sure I understand the limitations.
 

eigenv1

Member
Jul 13, 2017
70
58
Oklahoma
SO16 said: It's time for people to step up and take responsibility for their own actions. When enabling autopilot, I am reminded to pay attention EVERY SINGLE TIME! …

We live in an age where companies get sued for not labelling their hot coffee "Caution: Contents Hot", and where makers slap on similar warnings to minimize litigation, as they did after teens started challenging each other to swallow detergent pods. As for Tesla, one would hope that a flashing blue screen, a vibrating steering wheel, and instructions that Enhanced Autopilot is not full self-driving would be enough to remind people to keep a modicum of attention on the road. To be fair to all, the question needs to be posed in terms of true positives, false positives, false negatives, and true negatives, and the cost/benefit of each with respect to specific actions taken by the system for specific potential hazards (or the lack thereof, in the case of the negatives). Then compare that to what the average driver (across the full spectrum from alert to not alert) does in the same situations.
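A minimal sketch of that framing (every rate and cost below is invented for illustration): score the system by the expected cost of each outcome, then ask whether it beats the average driver on the same mix of hazards.

[CODE=python]
# Sketch only: invented outcome rates and costs, not real data.

def expected_cost(tp, fp, fn, tn, c_tp=1.0, c_fp=5.0, c_fn=1000.0, c_tn=0.0):
    """Expected cost per driving event, given outcome probabilities
    (summing to 1) and a cost for each outcome. A missed hazard (c_fn)
    is priced far above a phantom-braking event (c_fp)."""
    return tp * c_tp + fp * c_fp + fn * c_fn + tn * c_tn

# Hypothetical system vs. hypothetical average driver on the same hazards:
system = expected_cost(tp=0.08, fp=0.05, fn=0.02, tn=0.85)
driver = expected_cost(tp=0.07, fp=0.01, fn=0.03, tn=0.89)
print(f"system: {system:.2f}   driver: {driver:.2f}")  # system: 20.33  driver: 30.12
# Here the system phantom-brakes more (higher fp) yet still wins, because it
# misses fewer real hazards -- which is exactly the trade-off to scrutinize.
[/CODE]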
 
  • Like
Reactions: Eclectic

Cloxxki

Active Member
Aug 20, 2016
1,362
706
Rotterdam
How does the incompetent California government get off the hook for failing to replace the crash attenuator in front of the concrete barrier after a previous (non-Autopilot) crash months earlier?
There are plenty of countries where these crash attenuators don't even exist.
The attenuator had been gone for a while, and many thousands of cars managed to drive by safely. AP, with all its supposed redundancy and machine learning, accelerated into it. It would take thousands of non-AP cars running into that same lane divider to make AP even close to as safe as a person on that mile of road. And it's not the only spot where AP/FSD can simply get it wrong; worse, such spots are badly identified beforehand. From what I see of Navigate on Autopilot videos, it seems governed by very low-resolution GPS when it is supposedly machine-learning from visuals. Many missed exits and lane splits. It's apparently still really hard. No wonder AP, after all these years, is still happy to ram a semi from the side without any braking. But also (back by popular demand) ghost braking like a bad itch.
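A back-of-envelope version of that per-spot comparison, with wholly invented counts (none of these figures come from the article):

[CODE=python]
# Hypothetical counts, purely to show the shape of the per-spot comparison.
ap_passes, ap_crashes = 1_000, 1            # assumed AP passes of this gore point
human_passes, human_crashes = 1_000_000, 1  # assumed human passes since the prior crash

print(f"AP:    1 crash per {ap_passes // ap_crashes:,} passes")
print(f"Human: 1 crash per {human_passes // human_crashes:,} passes")
# Under these assumptions the per-pass rate at this one spot differs by 1,000x,
# whatever the fleet-wide averages say.
[/CODE]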

California is supposed to be rich, but it was the driver who was at fault, endangering everyone around him by not paying attention. And AP was unaware of the long-missing crash attenuator, or logic dictates it would have stayed far clear of it. Yet it managed to mistake the lane divider for the correct lane. And we're not going to get sensor updates on that front...
 
  • Like
Reactions: Kant.Ing

TMThree

Active Member
Mar 28, 2019
1,118
1,776
USA
The family admitted he had complained about this section of road before. I don't see how they have a case against Tesla. The guy knew this was a problem section and was warned that you need to be ready to take control from AP at any time.

Against the state... that's another matter, if it was required to repair the barrier and didn't do so within the required timeframe.
 
