Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

San Lorenzo family blames Tesla Autopilot for crash that killed teen son: lawsuit

...Also just saw the statements by the "expert" the Times called in, who said radar would have detected the truck and would have prevented the accident, and who speculated that the radar data was not being used. Even setting aside that this accident was in 2019 (when Tesla definitely was using radar), radar is well known for doing a poor job with changing targets and with things partially in the lane....

A well-known limitation of automotive radar is that it ignores stationary obstacles while excelling at detecting moving ones. Previous Autopilot accidents with fire trucks and police cars partially blocking the lane reflect that limitation: those vehicles were not moving; they were stationary obstacles. The Model X Autopilot collision with the median concrete barrier in Mountain View reflects the same limitation: that barrier was a sitting duck, not moving, a stationary obstacle.

This case is very different: the radar was playing to its strengths, because the pickup truck was moving.

It's possible that although some Tesla cars have radar, the weight given to it is affected by a bias against it, because Tesla believes in pure vision, not sensor fusion.
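The stationary-versus-moving distinction can be sketched in a few lines. This is a toy illustration of my own, not Tesla's or any vendor's actual code, and the speed values and tolerance are assumed numbers:

```python
# Toy illustration of why ACC radar tends to drop stationary targets:
# Doppler radar measures speed relative to the ego car, so a return whose
# estimated ground speed is ~0 looks identical to roadside clutter
# (signs, barriers, parked cars) and is usually filtered out rather than
# braked for.

def classify_return(relative_speed_mps: float, ego_speed_mps: float,
                    tol: float = 1.0) -> str:
    """Classify a radar return by its estimated ground speed."""
    ground_speed = relative_speed_mps + ego_speed_mps
    if abs(ground_speed) < tol:
        return "stationary (filtered as clutter)"
    return "moving (tracked)"

# Ego car at 25 m/s (~56 mph):
print(classify_return(-25.0, 25.0))  # concrete barrier -> filtered out
print(classify_return(-5.0, 25.0))   # slower pickup ahead -> tracked
```

On this reasoning, a moving pickup is in principle a favorable target for radar, while the Mountain View barrier was not.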
 
A well-known limitation of automotive radar is that it ignores stationary obstacles while excelling at detecting moving ones. Previous Autopilot accidents with fire trucks and police cars partially blocking the lane reflect that limitation: those vehicles were not moving; they were stationary obstacles. The Model X Autopilot collision with the median concrete barrier in Mountain View reflects the same limitation: that barrier was a sitting duck, not moving, a stationary obstacle.

This case is very different: the radar was playing to its strengths, because the pickup truck was moving.
Nope, ACC radar is also bad with partial-lane objects, even when they are moving, as well as with slow-moving vehicles in general; the pickup met both criteria at the time. The pickup's indecision can cause confusion too (it started moving into the lane, then started moving back toward its previous lane). I didn't mention it earlier, but the huge truck in front of the pickup may also confuse the radar (a large return that may be treated as a false positive).


Detection issues can occur:
  1. When driving on a different line to the vehicle in front.
  2. When a vehicle edges into your lane. The vehicle will only be detected once it has moved fully into your lane.
  3. When going into and coming out of a bend.
  4. When moving around a stationary vehicle. This may cause uncertainty as to which vehicle should be followed.
  5. When the vehicle ahead turns out of your lane. This may cause uncertainty as to which vehicle should be followed.
In these cases, ACC may brake late or unexpectedly. The driver should stay alert and intervene, if necessary.
https://topix.jaguar.jlrext.com/top...F/19390832-4bad-40d3-ba6a-b5bf903b10a3/en_GB?

"The function does not brake for humans or animals, and not for small vehicles such as bicycles and motorcycles. Nor for low trailers, oncoming, slow or stationary vehicles and objects."
Adaptive cruise control* | Adaptive cruise control | Driver support | XC60 2019 Late | Volvo Support

Here's the warning specific to TACC in Tesla's manual, which mentions all three issues discussed above.
"Traffic-Aware Cruise Control cannot detect all objects and, especially in situations when you are driving over 50 mph (80 km/h), may not brake/decelerate when a vehicle or object is only partially in the driving lane or when a vehicle you are following moves out of your driving path and a stationary or slow-moving vehicle or object is in front of you."

It's possible that although some Tesla cars have radar, the weight given to it is affected by a bias against it, because Tesla believes in pure vision, not sensor fusion.
At the time of the accident, Tesla was already using radar as a primary sensor (mentioned in a 2016 blog post), meaning it didn't rely on confirmation from the vision system (this was discussed in other threads). This was also in 2019, before Tesla began work on the rewrite of the system.
 
I'm surprised that nobody has mentioned yet that the pickup truck driver stepped on the brake right after he changed into the Tesla's lane.

Technically: braking by the driver in front would produce a slower-moving obstacle, but with a competent automation system, Tesla should be able to adjust its speed and braking for that scenario. However, Tesla claims a blanket exemption because it's "beta".

Legally: are you suggesting this should be a case for ticketing the pickup truck driver for "brake checking"?
 
Technically: braking by the driver in front would produce a slower-moving obstacle, but with a competent automation system, Tesla should be able to adjust its speed and braking for that scenario. However, Tesla claims a blanket exemption because it's "beta".
It has nothing to do with the beta tag (which likely provides no legal protection). The exemption comes from the system being L2 (which places the responsibility to react on the driver; the same reason the police report blamed the driver, not the car), from the fact that this situation is a known limitation of L2 systems and so is not considered a defect (see below), and also from the fact that Tesla details these limitations in multiple places (section 4.2 in the document below).

The ODI investigation for the 2016 crash points out in section 5.1 the known limitations of ACC.
https://static.nhtsa.gov/odi/inv/2016/INCLA-PE16007-7876.pdf
 
If an L2 car has a bug or hardware fault and "causes" a crash, then it's not the driver's fault. Things go wrong.

It may be a waste of everyone's time (except the lawyers') in this case, but you never know. At some point an L2 car is going to act in a way that the driver cannot react to in time; if it suddenly swerved, accelerated wildly, or braked aggressively with a fast truck directly behind, it could happen in an instant.

Is this case one of those? Who knows. I think they are attempting to argue that the L2 car was "supposed to avoid the accident". That has less likelihood of winning, IMO. I don't see the Tesla as having done anything that "caused" the accident, but at some point one car will. We can't always say L2 cars are not responsible; they can be if they are faulty. (Still, I don't see that in this particular case.)
 
Recently, I was driving on the highway in a rental car using TACC. After nearly rear-ending the vehicle in front of me several times, I realized that it was just plain-old Cruise Control.

Since I was looking straight ahead, the "COLLISION ALERT" in my brain was triggered, and I manually slowed down. It was reflex, really. Self-preservation instinct kicked in.

If a driver hits something without even trying to slow down, then they weren't paying attention.
 
First of all, this isn't an "Autopilot" story. It has absolutely nothing to do with autopilot.

It's a cruise control story.

The Tesla driver was slow to brake while on cruise control, but the law clearly says that the driver making the unsafe lane change is the one at fault. Furthermore, the truck driver should be facing manslaughter charges for the criminal child endangerment that caused that poor kid's death. It's a *crime* to allow a minor to ride without a seat belt, especially when driving recklessly.
 
One thing I would like to add is that, due to the focal length of the cameras, distances and speeds are not always what they appear to be. I've looked at quite a few recordings of situations I was in, and they looked entirely different from the actual situation. Most notable was one situation where some idiot pulled into an intersection although it wasn't his turn. I had to slam on the brakes since I was already pretty close, but when I looked at the video later, it appeared as if there was still plenty of space when he pulled into the street.

There is a video showing the effect of wide angle lenses here:

Long story short: in order to determine speeds and distances correctly, they'll have to measure the markings on the street to make a proper assessment. The video as it stands just shows what the plaintiff wants you to see. Realistically, the Tesla was probably not going as fast as it appears, and it was probably already much closer to the pickup truck than the video makes it look.
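To put rough numbers on the lens effect: under a simple pinhole model, an object's size on the image scales with focal length divided by distance, so a wide-angle (short focal length) dashcam renders the car ahead smaller, making the same gap look larger than a normal lens would. All figures below are assumed for illustration; they are not the actual camera's or vehicles' specs:

```python
# Pinhole-camera sketch of why a wide-angle dashcam makes a following
# gap look bigger than it is. Every number here is an assumption chosen
# for illustration only.

def image_height_px(real_height_m: float, distance_m: float,
                    focal_px: float) -> float:
    """Projected height of an object in pixels under the pinhole model."""
    return focal_px * real_height_m / distance_m

truck_height = 1.9   # assumed pickup height in metres
gap = 10.0           # assumed actual following distance in metres

normal = image_height_px(truck_height, gap, focal_px=1200)  # normal lens
wide = image_height_px(truck_height, gap, focal_px=400)     # wide-angle

print(normal, wide)  # 228.0 76.0
# The wide lens renders the truck 3x smaller, i.e. the same 10 m gap
# "reads" like a 30 m gap would on the normal lens.
```

This is only geometry, not an analysis of the actual footage, but it shows why apparent distance in a wide-angle clip can't be trusted without calibration against something of known size, like the lane markings.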
 
That sounds clear enough, but how do you explain in real life that “The police faulted the Tesla driver..."?

Did the police ticket the pickup driver for an unsafe lane change, and the reports just got the facts wrong?
That quote was from Tesla's attorney, whose only goal is to prove that the car is not at fault. It'd be bad optics to blame the grieving family, so he may have selectively mentioned only the Tesla driver's citation. Also, the article is framed as "Tesla killed yet another innocent victim," so the journalist might have selectively avoided mentioning the truck driver's negligence.

But nonetheless, the implication that the Tesla driver was the only one cited is likely correct. In a rear-end collision it's pretty standard to cite the rear-ender at the scene and let a judge evaluate any mitigating evidence later. The truck driver would have been cited only if the Tesla driver or another witness had insisted.

If that had been the kid's soccer coach driving the truck you can bet the family and police would be blaming him.
 
If we're to blame Autopilot, then wouldn't we also blame AEB, FCW, and adaptive cruise control?

The simple fact is that cars are not self-driving so how can we fault a system when 100% of the responsibility is on the driver?

The Tesla driver is 100% at fault for this accident. The video has been slowed down, so it's hard to tell, but there appears to have been plenty of time for the Tesla driver to react and slow down. Edit: the NYTimes video isn't slowed down, so that's a better source.

But the Tesla driver is not 100% at fault for the fatality, because the truck driver failed to obey the rules of the road. According to the NYTimes article, the kid wasn't wearing a seatbelt, and ultimately that's why he died. Not being buckled in turned a moderate accident into a fatal one.

Now, I'm not saying this to defend Tesla, but the simple fact is that ADAS systems bear no responsibility for an accident when they themselves didn't cause it. I don't agree with this arrangement; I think a better approach would have been to assign some responsibility, in an effort to make sure these systems actually worked.

One thing of note: a while back, Tesla introduced code that would slow the vehicle when the speed differential was high. It was a safety feature, but I think Tesla owners got mad and Tesla removed it. At least that's my recollection.
 
Now, I'm not saying this to defend Tesla, but the simple fact is that ADAS systems bear no responsibility for an accident when they themselves didn't cause it. I don't agree with this arrangement; I think a better approach would have been to assign some responsibility, in an effort to make sure these systems actually worked.
I'm no lawyer, but a manufacturer of a product with "reasonably foreseeable misuse" can have liability.
 
I'm no lawyer, but a manufacturer of a product with "reasonably foreseeable misuse" can have liability.
This...

Also, no one is arguing that the driver of the Tesla isn't responsible.

What I'm suggesting is that both the Tesla driver and Tesla (the company) should be liable to the third party (the passenger in the pickup truck) who was killed in this incident. If Tesla wants to sue the Tesla driver to recover the damages the third party collects from Tesla, fine.
 
This...

Also, no one is arguing that the driver of the Tesla isn't responsible.

What I'm suggesting is that both the Tesla driver and Tesla (the company) should be liable to the third party (the passenger in the pickup truck) who was killed in this incident. If Tesla wants to sue the Tesla driver to recover the damages the third party collects from Tesla, fine.
If there is no evidence of a defect in the system (this case doesn't show any so far; it's well within the known limitations of ACC in general not to brake in this situation, as discussed above), why should Tesla be held liable? If Tesla is held liable even when there is no defect, then should the manufacturer be held liable for every accident that happens with ACC or CC activated? If so, that would have a chilling effect on the prevalence of ACC and CC features (similar to how automakers are reluctant to push out L3 because they'd have to assume liability), which may actually decrease safety overall.

As for the latter point, I don't follow. Why should Tesla have to sue the driver to recover damages if Tesla shouldn't be held liable in the first place? The plaintiff should be the one suing the driver directly.
 
The Tesla self-driving system punishes paying attention by turning itself off if you give even minor steering input. Autopilot IS dangerous. But it's the rule of 3 errors. The truck driver erred. The Tesla driver erred. The family erred. Let's be frank, Tesla erred too.
 
If there is no evidence of a defect in the system (this case doesn't show any so far; it's well within the known limitations of ACC in general not to brake in this situation, as discussed above), why should Tesla be held liable? If Tesla is held liable even when there is no defect, then should the manufacturer be held liable for every accident that happens with ACC or CC activated? If so, that would have a chilling effect on the prevalence of ACC and CC features (similar to how automakers are reluctant to push out L3 because they'd have to assume liability), which may actually decrease safety overall.

As for the latter point, I don't follow. Why should Tesla have to sue the driver to recover damages if Tesla shouldn't be held liable in the first place? The plaintiff should be the one suing the driver directly.
Tesla should be held liable because it does have a defective product.

AP doesn't seem to take into account things like lane-change signals or the speed of cars in neighboring lanes. Yet Tesla encourages drivers to rely on it to essentially drive the car, with the driver merely overriding AP if the driver thinks it has made a mistake. Tesla should be liable to the third party.