
Tesla Sued for Deadly Crash on Autopilot

The family of a man who died last year after his Tesla Model X crashed on a California highway while operating on Autopilot is suing Tesla.

The suit alleges wrongful death and negligence stemming from failures and false promises regarding the Autopilot driver-assistance system.

The incident took place on March 23 on a busy stretch of Highway 101 when Apple engineer Walter Huang’s vehicle drifted out of its lane and crashed into a concrete rail. The car’s battery erupted into flames.

The National Transportation Safety Board reported later that the car had accelerated from 62 mph to 70 mph four seconds before the crash.

Tesla published a blog post in March 2018 defending Autopilot as not responsible for the crash.

“Our data shows that Tesla owners have driven this same stretch of highway with Autopilot engaged roughly 85,000 times since Autopilot was first rolled out in 2015 and roughly 20,000 times since just the beginning of the year, and there has never been an accident that we know of,” the company said in the post. “There are over 200 successful Autopilot trips per day on this exact stretch of road.”

Tesla said the reason this crash was so severe is that the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had either been removed or crushed in a prior accident without being replaced.

 
From what I know, the driver had himself to blame for not paying attention. But Tesla knew full well this was a risk, as it was causing accidents, even deadly ones.
In my opinion, Tesla has been actively doing as little as possible to get drivers to pay attention, only adjusting their approach and policies due to public/press outrage, not of their own accord, and openly resisting using tech to monitor the driver's eyes.
The whole concept of "machine learning" is a bit of an Apollo landing story to me. The barrier had been removed for a while and that exit taken numerous times. Somehow the car not only managed to leave its lane but also failed to spot the immovable object. And I don't see that Tesla learns, even through programming, from such errors and literal death traps. With years of opportunity for machine learning and programming in between, Autopilot recently failed again to spot a semi crossing the highway, again resulting in a death.
It doesn't matter whether Tesla has a less flawed driver-assist system than others if it still allows conditions to develop where lives are endangered.

I want Tesla to succeed in making cars safer, but even the AP incident data they release is an Elonism in itself. They take the miles driven on AP and divide them by accidents, then do the same for miles where AP was not active, neglecting to take into account the limited overlap between the two in roads and conditions, and the game changer of AP disabling itself when things get too difficult, leaving the driver to deal with the miles that are most likely to produce accidents.
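To illustrate the mixing-of-conditions problem, here is a rough sketch in Python with invented numbers; the split between "easy" and "hard" miles and every count below is hypothetical, not real Tesla or NHTSA data.

Code:
# Illustrative only: every figure here is invented to show the selection
# effect, not taken from any real data set.

# Suppose AP runs mostly on easy highway miles and disengages on hard ones,
# handing those miles back to the driver.
ap_easy_miles, ap_easy_accidents = 90_000_000, 20
driver_easy_miles, driver_easy_accidents = 10_000_000, 3
driver_hard_miles, driver_hard_accidents = 40_000_000, 60  # AP had bailed out here

ap_rate = ap_easy_accidents / ap_easy_miles
driver_rate = (driver_easy_accidents + driver_hard_accidents) / (
    driver_easy_miles + driver_hard_miles
)
driver_easy_rate = driver_easy_accidents / driver_easy_miles

print(f"AP accidents per million miles:     {ap_rate * 1e6:.2f}")          # 0.22
print(f"Driver accidents per million miles: {driver_rate * 1e6:.2f}")      # 1.26, the headline gap
print(f"Driver rate on the same easy miles: {driver_easy_rate * 1e6:.2f}") # 0.30

With these made-up inputs the headline comparison makes AP look roughly six times safer, while the like-for-like comparison on the same kind of miles shows a much smaller gap. That is exactly the overlap problem described above.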

Due to the humongous blind support from (prospective) buyers, Tesla gets away with utter amateurism across its operations. The latest versions of EAP/FSD are developing phantom braking again. Is that added safety or loss of quality?

I don't feel the family deserves much compensation from Tesla, but I do hope Tesla comes out the other end stepping it up. And I wish a court would hold them to it. Testing protocols are just so bad, my English vocabulary cannot begin to describe it. "But, but, over-the-air updates!" Those are a plaster, not a preventive measure.
I keep mentioning the Model 3 brakes (the lack of performance affected thousands of cars and was discovered by a consumer publication), but they really are a skyscraper-sized sign that they don't test new versions and don't explore corner cases. They just let the test fleet give the thumbs up and don't involve a professional tester to find corner cases. With so many cars on the road, even without AP, someone is going to find out the hard way that something was changed and not tested.

Let's hope something good comes from this; it should have happened before this person risked his own life and the lives of others sharing the road with him.

I will say that, with all the negative news being dragged out in the press, it seems AP has been quite efficient (or lucky) at keeping collateral damage to a minimum. If Teslas were killing anyone other than their occupants, this company would probably not be getting away with what it is now.
Due diligence with FSD Level 5 regulation will be interesting, to say the least. Actual scrutiny and independent testing seem like a no-brainer. Somehow, Tesla just pushes AP/FSD updates that they only gave the once-over, inevitably (if they did give them the once-over) knowing about increased phantom braking, taking wrong highway exits causing driver stress, etc.
 

So a nag every 30 seconds doesn't count as Tesla doing enough?
 
 
It’s time for people to step up and take responsibility for their own actions. When enabling autopilot, I am reminded to pay attention EVERY SINGLE TIME! This case better get thrown out.

If this is the same person who was complaining about the shortcomings of Autopilot, why wasn’t he paying attention? I’d think he’d be more apt to do so.
 
Tesla tells the driver at every start of EAP to always stay alert and in control. It beeps and stops if you remove your hands from the wheel. It reminds you every 30 seconds. What on earth more do you want Tesla to do? If you tell someone to be careful with a sharp knife, and they aren't and cut themselves, is that the knife's fault? Really? Only licensed adults are allowed to use this. Responsibility lies with them!!
 
So a nag every 30 seconds doesn't count as Tesla doing enough?
You'll find that an accident takes less time than that to develop. Try it enough times with a faulty version of AP and you'll find out, or at least your loved ones will. Real-time monitoring is the way to go.

Also, wasn't the nag much less frequent before? People were changing or sleeping on the highway. Why was the nag changed?
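To put the 30-second interval in perspective, a quick back-of-the-envelope in Python; the interval comes from the posts above, the speed is the 70 mph the NTSB reported, and the rest is simple arithmetic.

Code:
# How much road passes under the car between 30-second attention nags.
speed_mph = 70                            # final speed reported by the NTSB
speed_ms = speed_mph * 1609.344 / 3600    # about 31.3 m/s

nag_interval_s = 30                       # nag interval discussed in this thread
print(f"Distance between nags: {speed_ms * nag_interval_s:.0f} m")         # ~939 m
print(f"Distance covered in the 4 s before impact: {speed_ms * 4:.0f} m")  # ~125 m

Nearly a kilometre of road goes by between reminders, and the final acceleration the NTSB described played out over roughly 125 m.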
 
Very, very sad for the family's loss, and it makes sense they are looking for someone to blame, but all these claims of AP killing people, or trying to, are wearing thin.

We are and should be responsible and in control of any vehicle and the buck stops with us if we crash.

Guns clearly kill people, but someone has to pull the trigger; we don't blame the gun or the makers of it.
 
The guy basically committed suicide as far as I'm concerned. I've been banned for stating this in the past; ban me again if you want. He knew that AP struggled in that exact area on previous drives, but he let it crash this particular time. I'm not saying he definitely committed suicide, that's just my opinion. It's not like suicides are so rare that this would be out of the question.
 

He was vocal about AP being an issue. So why not pay extra attention?

Maybe... just maybe he may have known there was an accident there before and the driver survived. So to stress his point, perhaps he thought he'd let Autopilot crash there to prove a point... not realizing the barrier wasn't replaced. A long shot, and insensitive about the death, I know. But still, his actions (or lack thereof) seem EXTREMELY fishy to me.
 
I feel sorry for the young family. AP certainly has some issues in that location, and the broken attenuator only made it worse, but human responsibility cannot be excused from this tragic accident.

I have experienced many times that AP drives toward a broken lane marker that leads to an entrance to a reverse-traffic expressway if I don't pay attention, but every single time I can brake and steer the car to continue in my lane without any issue. I wouldn't try to see if it would stop in time with emergency braking before crashing into the crossbars, and I pay extra attention whenever I use AP in that location, or stay in the middle lane to avoid the situation.
 
It’s time for people to step up and take responsibility for their own actions. When enabling autopilot, I am reminded to pay attention EVERY SINGLE TIME! This case better get thrown out.

If this is the same person who was complaining about the shortcomings of Autopilot, why wasn’t he paying attention? I’d think he’d be more apt to do so.
Every single second. I’m not yet ready to throw caution to the wind (or ever). I just purchased FSD and I’m being very cautious to make sure I understand the limitations.
 

We live in an age where companies get sued for not labelling their hot coffee "Caution: Contents Hot", and where similar warnings appeared to minimize litigation after teens started challenging each other to swallow detergent pods. As for Tesla, one would hope that a blue flashing screen, a vibrating steering wheel, and instructions that Enhanced Autopilot is not fully self-driving would be enough to remind people to keep a modicum of attention on the road. To be fair to all, the question needs to be posed in terms of true positives, false positives, false negatives and true negatives, and the costs/benefits of each with respect to specific actions taken by the system for specific potential hazards (or the lack thereof in terms of the negatives). Then compare that to what the average driver (across the full spectrum from alert to not alert) does in the same situations.
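A minimal sketch of that framing in Python; every rate and cost below is a placeholder chosen only to show the structure of the comparison, not a measurement of Autopilot or of human drivers.

Code:
# Weigh each detection outcome by how often it happens and what it costs.
# All numbers are placeholders; only the structure matters.
outcomes = {
    # name: (events per million miles, cost per event in arbitrary units)
    "true_positive":  (50, 1),        # real hazard, system reacts: minor disruption
    "false_positive": (200, 5),       # phantom braking: rear-end risk, driver stress
    "false_negative": (1, 10_000),    # missed hazard: potential crash
    "true_negative":  (0, 0),         # nothing there, nothing done
}

expected_cost = sum(rate * cost for rate, cost in outcomes.values())
print(f"Expected cost per million miles: {expected_cost}")

# Fill in the same table for the average driver, alert or not, and compare.

The table itself is the useful part: it pushes the argument away from anecdotes and toward rates and consequences for each class of outcome.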
 
How does the incompetent California government get off the hook for failing to replace the crash absorber in front of the concrete barrier after a previous (non-Autopilot) crash months before?
There are plenty of countries where these crash absorbers don't even exist.
The thing had been gone for a while and many thousands of cars managed to drive by safely. AP, with all its supposed redundancy and machine learning, accelerated into it. It would take thousands of non-AP cars running into that same lane divider to make AP even close to as safe as a person on that mile of road. And it's not the only such spot where AP/FSD can just get it wrong, plus these spots are badly identified beforehand. From what I see of Nav on Autopilot videos, it seems governed by very low-res GPS when supposedly it's machine learning from visuals. Many missed exits and lane splits. It's apparently still really hard. No wonder AP is, after all these years, still happy to ram a semi from the side without any braking. But also (back by dope demand) ghost braking like a bad itch.

Cali is supposed to be rich, but in the car it was the driver at fault, endangering all life around him by not paying attention. And AP was unaware of the long-missing crash absorber, or logic dictates it would have stayed far clear of it. Yet it managed to mistake the lane divider for the correct lane. And we're not going to get sensor updates on that front....
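To give a sense of the per-spot rate being talked about, a crude calculation using only figures already quoted in this thread: the roughly 85,000 AP-engaged passes from Tesla's blog post and the single fatal crash under discussion.

Code:
# Back-of-the-envelope using only numbers quoted earlier in this thread.
ap_passes = 85_000    # AP-engaged trips past this spot since 2015, per Tesla's post
ap_crashes = 1        # the crash discussed here

rate_per_million = ap_crashes / ap_passes * 1_000_000
print(f"~{rate_per_million:.0f} AP crashes per million passes of this spot")  # ~12

# Comparing AP to a human driver at this exact spot would also need the human
# crash count for the same stretch over the same period, a number nobody in
# this thread has.

Either way, the per-spot comparison needs both numbers before it means anything.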
 
The family admitted he complained about this section of road before. I don't see how they have a case against Tesla. The guy knew this was a problem section and was warned that you need to be ready to take control from AP at any time.

Against the state... that's another matter, if they were required to make repairs to the barrier and didn't do it in the required timeframe.