
AP/FSD-related crashes

I don't recall this crash being discussed in this thread, but I'm posting it here anyway.
Media were quick to name Tesla’s driver-assist features as a possible cause after the NHTSA included the accident as part of its probe into alleged Autopilot-related accidents.

Uh huh... 🙄
 
Jury rules that Tesla is not liable when AP is used on city streets and veers into a curb.
 
Makes sense to me. For one, AP should not be used on city streets. These are the most telling portions:

Tesla denied liability for the accident and said in a court filing that Hsu used Autopilot on city streets, despite a user manual warning against doing so.

In Los Angeles Superior Court on Friday, the jury awarded Hsu zero damages. It also found that the airbag did not fail to perform safely, and that Tesla did not intentionally fail to disclose facts.

After the verdict, jurors told Reuters Tesla clearly warned that the partially automated driving software was not a self-piloted system, and that driver distraction was to blame.
 

Does it actually say "AP should not be used on city streets" in the Model S manual? I didn't find that part, just something about narrow roads and "intended for use on highways" or some such words. Does it clearly say YOU MUST NOT use it on city streets, or just YOU SHOULD NOT? Is SHOULD NOT really enough of a statement to eliminate their liability?

Also - and perhaps not relevant to this exact jury trial, but more of a general question - if you engaged it on the highway and it didn't disengage when you got onto the exit ramp, and it stayed engaged on city streets, is it your fault for not turning it off? If the car kept driving, maybe you'd think, 'I guess it's OK then'?
 
A judge and jury decided, with this verdict, that it was enough of a statement.

Since it's a Level 2 ADAS system, yes, it's your fault for not disengaging the system after leaving the freeway, as you are in control of the vehicle.

I could say the same thing about cruise control. If someone exits the freeway with cruise control enabled and keeps it running, is it the vehicle's fault if it runs a red light and crashes? That's obviously an absurd question; everyone understands you have to disengage cruise control when leaving the freeway. I think it's just that this is a new technology people are still getting used to, without fully understanding its capabilities before using it.

Put another way: if you assemble IKEA furniture without reading the manual, and it falls apart because you didn't heed the warning on page 3, whose fault is it for damaging your items?
 
The one part that remains to be tested in court is whether a Level 2 system can make a manoeuvre that is excessively quick or overpowering. Not saying that AP has done that, but some people do say it sometimes takes a strong pull to recover from an Autosteer mistake... and some people are not strong.

If a reasonably alert, hands-on driver crashes because of that type of thing, I think there is a case for a dangerous product. We'll see, if that is ever cited as a cause.
 
Los Angeles resident Justine Hsu sued in 2020, saying her Tesla Model S swerved into a curb while on Autopilot, and an airbag was deployed "so violently it fractured Plaintiff's jaw, knocked out teeth, and caused nerve damage to her face.”
 
The plaintiff didn't claim it was too quick or overpowering to take over. Instead, they tried to claim Tesla misled the plaintiff into believing the vehicle can drive itself. Most of the lawsuits so far don't come from an angle that acknowledges that AP or FSD places full responsibility on the driver and then focuses on the driver having had no possibility of overriding the system as the basis for suing. So far there don't seem to have been any instances of that.
 
Good job stopping in time. The driver must have been distracted not to see that.

FSD might have avoided contact if it hadn't turned in the direction the child was running, since the adult appeared to stay at the double yellow line. But that's a split-second guess FSD makes a second or more in advance, whereas a human can make split-second, last-moment control changes.
 
Even if FSD goes non-Beta, that's just the overall software package -- Autosteer on City Streets will still be Beta.

Otherwise, I'm not sure what people are expecting from this, considering every Autopilot module except Autopark remains Beta to this day; it's just buried in the manual.
 
You can thank Google for ‘innovating’ the idea that ‘beta’ means ‘we can’t be arsed to commit to this working’ rather than ‘we’re pretty sure this is ready for imminent release,’ which is what it used to mean.
 