
Tesla Sued for Deadly Crash on Autopilot

The family of a man who died last year after his Tesla Model X, operating on Autopilot, crashed on a California highway is suing Tesla.

The suit alleges wrongful death and negligence stemming from failures and false promises regarding the Autopilot driver-assistance system.

The incident took place on March 23 on a busy stretch of Highway 101 when Apple engineer Walter Huang’s vehicle drifted out of its lane and crashed into a concrete rail. The car’s battery erupted into flames.

The National Transportation Safety Board later reported that the car had accelerated from 62 mph to 70 mph in the four seconds before the crash.

In March 2018, Tesla published a blog post arguing that Autopilot was not responsible for the crash.

“Our data shows that Tesla owners have driven this same stretch of highway with Autopilot engaged roughly 85,000 times since Autopilot was first rolled out in 2015 and roughly 20,000 times since just the beginning of the year, and there has never been an accident that we know of,” the company said in the post. “There are over 200 successful Autopilot trips per day on this exact stretch of road.”

Tesla said the reason this crash was so severe is that the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had either been removed or crushed in a prior accident without being replaced.
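Taken at face value, the trip counts Tesla quoted are at least internally consistent. A quick back-of-the-envelope check (the October 2015 rollout date is an assumption on my part; the post only says "2015"):

```python
from datetime import date

# Trip counts quoted in Tesla's blog post.
trips_since_rollout = 85_000
trips_ytd = 20_000
crash_day = date(2018, 3, 23)

days_ytd = (crash_day - date(2018, 1, 1)).days   # 81 days
print(trips_ytd / days_ytd)                      # ~247/day: "over 200 per day" checks out

# Rollout date assumed; the post does not give a month.
days_since_rollout = (crash_day - date(2015, 10, 15)).days   # 890 days
print(trips_since_rollout / days_since_rollout)  # ~96/day averaged over a smaller early fleet
```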

 
Something tells me you're not a big believer in personal responsibility... Tesla is 0% at fault for this crash and the driver is 100% at fault, case closed.

Jeff
Au contraire, mon ami!
I was the first to mention that, if the car was indeed on AP, the driver had himself to blame. I wasn't too understanding.
Not only do I not want this driver behavior on the road, I do not want them representing BEV drivers.

At the same time, Tesla KNOWS that utter morons are buying their cars, and it does precious little to prevent them from being morons at the wheel. The wheel has seen costume changes and naps. No pricing scheme can prevent that. And they won't monitor eye movements, because, well, because!

While it's a nice attempt at deflecting the issue, software is fallible and the people at Tesla know it. Their moron customers DIE on AP. In non-AP cars, they would likely have passed that concrete lane divider in good health.
So tell me this: what is so hard, as a precaution, about disabling Autopilot on this section of road? For crying out loud, it was not a one-off incident. A Tesla vlogger tested his car and it seemed to repeat the fateful behavior to a T. Not worth investigating and releasing a fix for? Not worth the grievance of losing 300 m of self-driving for other Tesla drivers passing the spot where one of them just lost his life?
 
There is a stretch of road I drive in Arkansas that, while not as busy as a California highway, has a similar issue with the autopilot. It's as if the AP loses its way for a few seconds and wanders across the line and back. It occurs almost every time I drive it, and I know where it occurs; thus, I anticipate and adjust for it. I do not feel unsafe using my autopilot.
 
I anticipate and adjust for it. I do not feel unsafe using my autopilot.

The cases where autopilot fails are those that humans navigate with ease. Consequently, the average driver will NOT, I repeat, NOT anticipate these failures and will be caught unprepared. He or she has no concept of the difference between regular intelligence and artificial intelligence, mistakenly believing that the AI will behave in a human-like manner and not make obviously fatal bone-headed moves like driving into a gore point. Humans are simply far better at determining context, recognizing objects, and detecting threats.

Not only that, but in this particular case, as I recall, the car kept pulling toward the gore point and he kept correcting it, and then a firmware update seemed to fix the problem temporarily. So it may be that a subsequent update reintroduced the bug, again catching him by surprise, after he mistakenly assumed that software updates are improvements, not regressions. (After all, the car was still wrenching into gore points after the fatal accident, as the YouTube reenactors documented.) Bottom line: it's not reasonable to expect drivers using Autopilot to babysit the car for bugs, especially bugs that disappear and then reappear unexpectedly. It's there as a convenience feature, presumably so you can reduce your cognitive load. Being ever vigilant like that defeats the purpose.
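For what it's worth, intermittent regressions like that are exactly what pinned scenario tests exist to catch. A minimal sketch, with a toy planner and hypothetical inputs standing in for whatever Tesla actually uses:

```python
def plan_lateral_offset(left_m: float, right_m: float,
                        left_conf: float, conf_floor: float = 0.5) -> float:
    """Toy lane-keeping planner: aim for the midpoint of the detected lane
    boundaries, but hold the lane center when a boundary reading is
    low-confidence (e.g. a faded line opening into a gore area)."""
    if left_conf < conf_floor:
        return 0.0                      # don't chase a boundary we can't trust
    return (left_m + right_m) / 2.0

def test_gore_point_scenario():
    # Hypothetical recorded inputs from the known trouble spot: the left line
    # is faded (low confidence) and appears to swing wide into the gore area.
    offset = plan_lateral_offset(left_m=-3.5, right_m=1.8, left_conf=0.2)
    assert abs(offset) < 0.5, "planner drifted toward the gore point"

test_gore_point_scenario()
```

The point of pinning the scenario is that any later update that again trades the lane center for a faded boundary fails the test suite immediately, instead of surprising a driver at the same spot.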

So I welcome this lawsuit as it should force the system to ascribe some proper liability, as I am tired of the wagon-circling and victim-blaming the fanbois spew out of their keyboards.

Elon has a habit of avoiding blame until he realizes he's completely backed into a corner (like his stubborn resistance to the SEC). This has been internalized by the fanbois, and it's really dysfunctional, egotistical, tribal, MAGA-hat-like behavior.
 
What proof does anyone have that the driver wasn't paying attention?

Plenty of times I've had my hand on the wheel in autopilot and the car didn't think I did.

Plenty of times using autopilot the car has suddenly jerked in one direction or another or braked suddenly, so that even if I was paying attention it could have caused a serious accident if another vehicle or a wall was in the wrong place.
 
What proof does anyone have that the driver wasn't paying attention?

Plenty of times I've had my hand on the wheel in autopilot and the car didn't think I did.

Plenty of times using autopilot the car has suddenly jerked in one direction or another or braked suddenly, so that even if I was paying attention it could have caused a serious accident if another vehicle or a wall was in the wrong place.
The logs will show whether there was a steering change into the wall, or whether the car was driving straight, following another car or the faded lane lines into the gore zone with the collapsed barrier and no signage straight ahead.
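For illustration, the kind of check those logs would support might look like the sketch below; the record format, field names, and thresholds here are entirely hypothetical, and real vehicle logs differ:

```python
# Hypothetical log records: (seconds_before_impact, steering_angle_deg,
# driver_torque_detected). Illustrative only.
records = [
    (6.0, 0.4, False),
    (4.0, 0.5, False),   # NTSB: speed rose from 62 to 70 mph around here
    (2.0, 5.8, False),
    (0.5, 6.1, False),
]

# Did the steering angle swing in the final seconds, and were hands detected?
swing_times = [t for t, angle, hands in records if abs(angle) > 3.0]
hands_on = any(hands for _, _, hands in records)

if swing_times and not hands_on:
    print(f"steering change ~{max(swing_times)}s before impact, no driver "
          "torque detected: consistent with an AP-initiated move")
elif not swing_times:
    print("no steering change: car tracked straight, e.g. following faded "
          "lane lines into the gore zone")
```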
 
What proof does anyone have that the driver wasn't paying attention?

Plenty of times I've had my hand on the wheel in autopilot and the car didn't think I did.

Plenty of times using autopilot the car has suddenly jerked in one direction or another or braked suddenly, so that even if I was paying attention it could have caused a serious accident if another vehicle or a wall was in the wrong place.
I'm a broken record on this... if you check your rearview or sideview mirrors or check your blind spot, you're taking your eyes off the road for a few seconds. If the guy ahead of you slams on his brakes, you're gonna hit him (and be at fault, of course), but you weren't playing with your phone. You were being responsible and an accident still happened. The people who recreated the gore-point accident only had a couple of seconds to respond and hit the brakes, and they were paying close attention. One cannot assume the driver was 'distracted' or acting irresponsibly.
 
Tesla should not make any statement if the best it can do is count how many times other drivers did not crash in the same place. I don't think Tesla is to blame for this, but citing the number of success stories says absolutely nothing about whether its technology was at fault in this case. Considering the number of over-the-air software updates, we don't even know how many of those successful trips were running the same software and data.

I'd like to know whether the acceleration from 62 mph to 70 mph was due to the driver pressing the accelerator pedal or the AP deciding to accelerate above the speed limit. I'm sure Tesla's log files contain other details, such as whether the driver was still holding the wheel at the time, and whether he steered into the wall, failed to brake, or failed to steer away from the obstacle.

As others have already said, the loss of life is always sad. Humans make mistakes. It's not clear who is to blame, and Tesla hasn't done a good job if that's the only press release.
 
I'd like to know whether the acceleration from 62 mph to 70 mph was due to the driver pressing the accelerator pedal or the AP deciding to accelerate above the speed limit.

Driver set AP to 70, and when the car in front got out of the way, AP resumed the speed the driver had set, and
AP did not detect that the gore zone wasn't a proper lane, and
AP didn't detect the barrier, which also didn't have a large sign at its point, as they often do and should have, and, most importantly,

Caltrans failed to reset the crash attenuator and repaint the faded lane lines.
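If that reading is right, the 62-to-70 mph acceleration in the NTSB report is just the set-speed resume. A minimal sketch of that cruise behavior (illustrative only, not Tesla's actual control code):

```python
from typing import Optional

def target_speed_mph(set_speed: float, lead_speed: Optional[float]) -> float:
    """Toy traffic-aware cruise logic: match a slower lead car while one is
    detected, and resume the driver-set speed once the lane ahead clears."""
    if lead_speed is None:               # nothing detected ahead
        return set_speed                 # resume the driver's setting (70 here)
    return min(set_speed, lead_speed)    # never exceed the set speed

assert target_speed_mph(70, 62) == 62    # pacing the slower car in front
assert target_speed_mph(70, None) == 70  # lead moves aside: accelerate to 70
```

On that reading, the acceleration itself is the system doing what it was set to do; the failure was treating the gore zone as a clear lane ahead.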
 