
Tesla wins first US Autopilot trial involving fatal crash


EVNow


(Reuters) - Tesla on Tuesday won the first U.S. trial over allegations that its Autopilot driver assistant feature led to a death, a major victory for the automaker as it faces several other similar lawsuits across the country.

The case, in a California state court, was filed by two passengers in a 2019 crash who accused the company of knowing Autopilot was defective when it sold the car. Tesla argued human error caused the crash.

The 12-member jury on Tuesday announced they found the vehicle did not have a manufacturing defect. The verdict came on the fourth day of deliberations, and the vote was 9-3.

....

Tesla denied liability, saying Lee consumed alcohol before getting behind the wheel. The electric-vehicle maker also claims it was unclear whether Autopilot was engaged at the time of the crash.

Tesla won an earlier trial in Los Angeles in April with a strategy of saying that it tells drivers that its technology requires human monitoring, despite the "Autopilot" and "Full Self-Driving" names.

That case was about an accident where a Model S swerved into the curb and injured its driver, and jurors told Reuters after the verdict that they believed Tesla warned drivers about its system and driver distraction was to blame.

[Image: "TESLA logo, freshly unveiled at Fremont Factory" by jurvetson, licensed under CC BY 2.0]
 
Other existing cases may or may not go differently if it's actually proven Autopilot was active at the time; the likelihood of proving that probably increases in more recent years, as the technology became more developed.

But to me, these cases mostly serve to illustrate the perils of an automaker taking liability for what its vehicles are doing, and how far away that likely is. The person in this case was going after $400 million in damages, from a single crash! Imagine the repercussions of a bad accident in a vehicle with no steering wheel or pedals, or one where liability has actually been assumed, multiplied across millions of vehicles.
 
  • Informative
Reactions: pilotSteve
You hit the nail on the head. If two humans are in an accident, insurance is contacted and there are typically coverage limits; sometimes medical costs exceed those limits, and lawsuits follow to recoup those dollars. However, with a corporation like Mercedes taking responsibility for the accident, the lawsuits will suddenly get ridiculously large, jumping from tens of thousands of dollars to millions.
 
Exactly; this has been, and continues to be, my belief.

And that's why Mercedes' Level 3 functionality is so hamstrung: it works only in certain vehicles loaded with sensors, on mapped roads, below certain speeds, with a lead car, and even then it's still difficult to activate. One bad crash with L3 attributable to Mercedes could be a $400 million hit to the bottom line. I think Mercedes' Level 3 is a good real-world example of the kind of risk mitigation necessary before an automaker will consider taking liability.
 
But at the end of the day, if Autopilot prevents just one major injury, it's worth every single dollar on the planet to that person.
I've said this many times - AP/FSD Beta will cause accidents, and it will kill people. People need to accept that. The point is that it should cause FEWER accidents and kill FEWER people than humans do. Last year 42,000 people died in car crashes in the US. If we can save any of those people by using ADAS systems, it's worth it.

The issue is human nature. Why is it more acceptable to us if another human kills a friend or loved one in a car accident vs a computer killing them? We want computer systems to be infallible and cause 0 accidents, but that's not reasonable.
 
Is Autopilot different than what Tesla calls Full Self Driving (FSD)?
Um. Tesla defines these things differently, but fuzzily:
  • Autopilot: Traffic Aware Cruise Control (TACC) and Lane Keep (LK). Won't change lanes on its own. All Teslas come with this.
  • Enhanced Autopilot: TACC and LK, some minimal non-interstate stuff (will stop at stop signs and lights). On interstates, will take off-ramps and on-ramps; if one off-ramp leads to an on-ramp on another interstate, it'll keep on going, so it's called Navigate on Autopilot.
  • Full Self Driving: Not currently a complete product, it truly is a Beta. (Motto, and this is a quote: 'Will do the wrong thing at the worst time.' So stay alert.) Notably, while really staying alert, the car, with a destination in mind, will navigate city streets, highways, and what-all when going from point A to B. How well it does this? Well, it's a Beta, but, as Betas go, it's seriously much better than it was, say, 16 months ago. But one doesn't take one's hands off the steering wheel or take one's eyes off the road (well, very quick glances are OK) unless one has a death wish. And the "You gotta click through this before you get to use it" text is Very Explicit about what one is getting into. (A rough structured summary of these tiers follows below.)
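If it helps to see that breakdown in one place, here's a minimal sketch of the tiers as structured data. The package names follow the list above; the feature flags and the has_feature helper are purely illustrative, not Tesla's actual option codes or software.

```python
# Illustrative summary of the driver-assist tiers described above.
# Feature flags reflect this thread's summary, not Tesla's official option matrix.
PACKAGES = {
    "Autopilot": {
        "tacc": True,               # Traffic Aware Cruise Control
        "lane_keep": True,
        "navigate_on_autopilot": False,
        "city_streets": False,
    },
    "Enhanced Autopilot": {
        "tacc": True,
        "lane_keep": True,
        "navigate_on_autopilot": True,   # on/off-ramps, interchange to interchange
        "city_streets": False,
    },
    "Full Self Driving (Beta)": {
        "tacc": True,
        "lane_keep": True,
        "navigate_on_autopilot": True,
        "city_streets": True,            # still requires full driver attention
    },
}

def has_feature(package: str, feature: str) -> bool:
    """Return True if the named package includes the named feature."""
    return PACKAGES.get(package, {}).get(feature, False)

print(has_feature("Autopilot", "city_streets"))           # False
print(has_feature("Full Self Driving (Beta)", "tacc"))    # True
```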
 
  • Like
  • Informative
Reactions: ionsphere and rlsd
I consider FSD to be an overarching software suite where Autopilot is one module — what people typically call “FSD Beta” in terms of driving around in urban areas is technically called “Autosteer on City Streets” and is another module within FSD.

But FSD is the whole package inclusive of the ability to flip between modules, like going from Autopilot to Autosteer on City Streets depending on road type.
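A toy sketch of that "flip between modules" idea, strictly as I read the description above: the module names come from the post, but the road-type logic and function names are hypothetical and are not Tesla's actual architecture.

```python
# Hypothetical illustration of one suite (FSD) handing control to different
# driving modules based on road type, per the description above.
from enum import Enum, auto

class RoadType(Enum):
    HIGHWAY = auto()
    CITY_STREET = auto()

def select_module(road_type: RoadType) -> str:
    """Pick which module handles the drive, within the overall FSD suite."""
    if road_type is RoadType.HIGHWAY:
        return "Autopilot (Navigate on Autopilot)"
    return "Autosteer on City Streets"

# Example: the suite switches modules as the road type changes mid-drive.
for road in (RoadType.HIGHWAY, RoadType.CITY_STREET):
    print(road.name, "->", select_module(road))
```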


You rarely hear or see this terminology actually used by Tesla though, I’ve only seen it in the leaked Cali DMV letters and a smattering of instances where one of the leadership teams has referenced “City Streets” in presentations/earnings calls — not sure I’ve ever heard Elon talk about it like this.
 
Other existing cases may or may not go differently if it's actually proven Autopilot was active at the time; the likelihood of proving that probably increases in more recent years, as the technology became more developed.
I'd like to hear lawyers' opinion on this ...

But the baseline is: what's the point of calling it Level 2 / ADAS etc. if the automaker is responsible? Only if the automaker says it's Level 3 should the OEM be responsible.

Also, once a few such cases go Tesla's way, few lawyers will be willing to bring them to court, since they won't get paid a ton.
 
(Motto, and this is a quote: 'Will do the wrong thing at the worst time.' So stay alert.)
Actually, that's not a correct quote. The actual quote is "It may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention to the road."

It's "May" and not "Will". Normally I wouldn't nitpick this but you did specially say it was a quote.
 
The person in this case was going after $400 million in damages, from a single crash! Imagine the repercussions of a bad accident in a vehicle with no steering wheel or pedals,
one bad crash with L3 attributable to Mercedes could be a $400 million hit to the bottom line. Mercedes' Level 3
I completely understand the principle of Deep Pockets and the likelihood that AV accident dollar claims and settlements are likely to go up, even as the number of serious accidents goes down.

However, I don't think it's a strong argument to base your analysis and future predictions on an outlandish $400M lawsuit claim, particularly one that was lost before it could ever get to an actual award phase.

It's a bit like searching eBay for insanely high asking prices, with zero actual sales, on items similar to what you happen to have laying around in your closet - and then using that highly flawed data to estimate your own net worth.

Again, I understand that there will be a spate of aspirational sky-high claims as we move into the era of self-driving cars. At least in the US where we specialize in tort insanity.

On the other hand, AVs are equipped with a massive number of recorded camera angles. This discoverable evidence, combined with the idea that true serious misbehavior and fault will be very rare in a released L3/4/5 product, makes me think that the liability risk may be far lower than what people are pessimistically thinking. We all understand the syndrome of an at-fault driver inventing (or maybe actually believing) his own twisted version of events, but this gets a lot harder when the video and telemetry is readily available.
 
  • Informative
Reactions: pilotSteve
I've said this many times - AP/FSD Beta will cause accidents, and it will kill people. People need to accept that. The point is that it should cause FEWER accidents and kill FEWER people than humans do. Last year 42,000 people died in car crashes in the US. If we can save any of those people by using ADAS systems, it's worth it.

The issue is human nature. Why is it more acceptable to us if another human kills a friend or loved one in a car accident vs a computer killing them? We want computer systems to be infallible and cause 0 accidents, but that's not reasonable.
Agreed!

Why is it more acceptable? Blame (putting the shrink hat on)
For some reason, society has to have someone to blame when bad things happen.
So if another human was driving, they can be blamed. If no one is driving, they'll go after the manufacturer.
Our litigious society thinks there always has to be some entity held accountable.
Truth is, progress is bumpy. Can't make an omelet... as they say. AP/FSD will save far more lives than it costs, and it will only get better over time. So the real question is, what's the threshold at which the deaths become an acceptable amount? If we're expecting perfection, then it will never be a reality. But if we can quantify (very difficult) saved lives vs caused deaths, where is the line drawn? Is it 50/50? 80/20? 99/1?
42k died last year? I would assume with the number of new cars on the road every year, that number trends upward?

Here's a quick example estimation / thought experiment (a rough calculation of these numbers is sketched below):
If EVs with AP/FSD were to reach 10% of all cars on the road next year, and there were a 9% drop in fatal car accidents, that would mean 3,780 fewer deaths out of the previous year's 42k.
Are we to focus on the 420 EV-related deaths, or on the 3,780 fewer deaths there were overall?
And this assumes that as the number of EVs on the road rises each year, deaths will continue to go down as well.
So yes, I agree all lives are important. But do we just stop trying to progress this technology because it's not perfect?
And keep in mind, anyone driving an EV has chosen to do so, which, to me, means they should share some responsibility if they choose to use AP/FSD. (Causing the deaths of others who did not make that choice is another discussion.)
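To make that arithmetic concrete, here is a rough sketch of how I read the numbers: 10% of 42,000 deaths would be about 4,200 attributable to those cars at the old rate, and the 420 figure appears to be that 4,200 minus the 3,780 avoided. The post doesn't spell that out, so treat it as my assumption.

```python
# Rough arithmetic behind the thought experiment above.
# Assumes the 420 "EV-related deaths" figure is 10% of 42,000 minus the deaths avoided.
baseline_deaths = 42_000   # US road deaths in the previous year
ev_share = 0.10            # EVs with AP/FSD as a share of cars on the road
overall_drop = 0.09        # hypothetical drop in fatal accidents

deaths_avoided = baseline_deaths * overall_drop               # 3,780 fewer deaths
ev_deaths_at_old_rate = baseline_deaths * ev_share            # 4,200 expected
remaining_ev_related_deaths = ev_deaths_at_old_rate - deaths_avoided  # 420

print(f"Deaths avoided overall: {deaths_avoided:.0f}")
print(f"EV-related deaths remaining: {remaining_ev_related_deaths:.0f}")
```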

The reality is, accidents happen. Sometimes there is no one to blame.
It's scientific randomness. But our overly religious society thinks someone is always in control of everything. Ludicrous.
It's a plague on the human psyche.

I'm not trying to bash spiritual folks, just stating the base issue regarding this thread. It goes back that far. Many people can't live with the reality that life is random, and have to feel comfort from thinking someone/something else is in control, and the bad things that happen must be for a reason. They aren't. It's just random life.
All the hate in this world comes back to that. People feel cheated or jealous that others' lives seem better than theirs. No matter how different we think we are, we're really all pretty similar if beliefs and prejudices are put aside.
Enjoy the time you have here, love your family, treat people with respect, and everyone can get along and prosper, whatever that looks like to each of us.
 
  • Like
Reactions: Wattsisname
In the '90s, when the Greyhound bus line was sporting forward and blind-spot radars, the meager goal was giving bus drivers a split second of advance warning for improved outcomes, as well as data for accident reconstruction. Of course the systems back then didn't control steering and such, but Greyhound expected fewer lawsuits. And with those old systems there wasn't much talk about forward-warning system design snafus or failures, let alone NN training data challenges.

The $10k to $15k folks paid TSLA sets up a target for ongoing lawsuits over negligence. And TSLA claims the moon while disclaiming liability to the fullest extent of the law. Of course these lawsuit results heavily depend on the local jury pool. But even then one wonders whether juries will be as accommodating/forgiving, especially when otherwise innocent drivers become roadway victims of modern driving conveniences. And I won't be surprised if disgruntled TSLA employees become expert witnesses.

I'm still waiting to hear anything about that multi car pile-up on the SF bridge/under/overpass.

Probably no surprise if the public never hears about cases TSLA loses.
 
  • Like
Reactions: zoomer0056
Again, I understand that there will be a spate of aspirational sky-high claims as we move into the era of self-driving cars. At least in the US where we specialize in tort insanity.
OTOH, if the claims against large companies are the same as those against non-wealthy individuals, they will care very little about safety and just write off "small" claims as a cost of doing business. Even with large exposure, we have seen multiple cases of large companies ignoring safety to save a few bucks, or, in the case of Uber or Cruise, doing things that are clearly not safe.

PS:

Tesla denied liability, saying Lee consumed alcohol before getting behind the wheel. The electric-vehicle maker also claims it was unclear whether Autopilot was engaged at the time of the crash.

I think there are very irresponsible drivers who think they can drink and drive, putting Autopilot on to help them. They are too drunk to make sure AP is on. It is quite possible AP turns off and they are too inebriated to even figure that out. Tragic, and with kids in the car, no less.
 
  • Like
Reactions: zoomer0056
In the '90s, when the Greyhound bus line was sporting forward and blind-spot radars, the meager goal was giving bus drivers a split second of advance warning for improved outcomes, as well as data for accident reconstruction. Of course the systems back then didn't control steering and such, but Greyhound expected fewer lawsuits. And with those old systems there wasn't much talk about forward-warning system design snafus or failures, let alone NN training data challenges.

The $10k to $15k folks paid TSLA sets up a target for ongoing lawsuits over negligence. And TSLA claims the moon while disclaiming liability to the fullest extent of the law. Of course these lawsuit results heavily depend on the local jury pool. But even then one wonders whether juries will be as accommodating/forgiving, especially when otherwise innocent drivers become roadway victims of modern driving conveniences. And I won't be surprised if disgruntled TSLA employees become expert witnesses.

I'm still waiting to hear anything about that multi car pile-up on the SF bridge/under/overpass.

Probably no surprise if the public never hears about cases TSLA loses.
If Tesla loses, it will be widely reported; I see no possibility it won't be. The media loves to report on Tesla because it drives clicks.

If the case *settles*, however, it might not be reported, but this is par for the course for auto manufacturers. Plenty settle cases that would otherwise be subject to a recall or widely reported (for example, the whole GM ignition-switch defect).
 
  • Like
Reactions: sleepydoc