
San Lorenzo family blames Tesla Autopilot for crash that killed teen son: lawsuit

AEB does not stop the vehicle. It merely reduces speed.

Like TACC does other than edge cases, such as a pickup truck making an illegal lane change just feet in front of a car.
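To illustrate why an AEB-style system often only scrubs off speed rather than guaranteeing a full stop, here's a back-of-the-envelope sketch. This is a generic illustration, not Tesla's actual logic, and every number in it (trigger threshold, deceleration, gap sizes) is assumed: braking that only starts once time-to-collision drops below a threshold can't remove all the speed when the cut-in happens just feet ahead.

```python
# Back-of-the-envelope sketch of a generic TTC-triggered AEB intervention.
# NOT Tesla's implementation; the trigger threshold, deceleration, and gaps
# below are all assumed numbers, purely for illustration.

def impact_speed_mps(closing_speed_mps: float, gap_m: float,
                     ttc_trigger_s: float = 1.5, decel_mps2: float = 8.0) -> float:
    """Speed still carried into the impact if hard braking only starts
    once time-to-collision (TTC) drops below the trigger threshold."""
    ttc = gap_m / closing_speed_mps
    if ttc > ttc_trigger_s:
        # Braking starts later, once the gap has shrunk to the trigger distance.
        gap_m = closing_speed_mps * ttc_trigger_s
    stop_dist_m = closing_speed_mps ** 2 / (2 * decel_mps2)   # v^2 / (2a)
    if stop_dist_m <= gap_m:
        return 0.0                                            # full stop before contact
    return (closing_speed_mps ** 2 - 2 * decel_mps2 * gap_m) ** 0.5

# ~9 m/s (about 20 mph) closing speed:
print(impact_speed_mps(9.0, 20.0))           # 0.0 (plenty of room, full stop)
print(round(impact_speed_mps(9.0, 4.0), 1))  # 4.1 m/s (~9 mph) still carried into the impact
```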
They said the Tesla began to slow a second before the collision. They don't say if it was the driver or the AP/TACC/AEB nor by how much it slowed.

So what are we looking at here? Either the Tesla driver was comfortable enough in the system not to pay attention, or perhaps he was paying attention but was entirely deferring driving authority to the AP.

He, no doubt, was assuming that his Tesla would not crash into things. He was wrong, but why was he wrong? One possibility is that TACC did not sense the problem; only the logs can answer that. Further, AEB did not sense the problem or did not respond adequately. It is designed to see exactly this kind of problem, and the vehicle ahead was hardly stopped or slow-moving, so AEB really should have done better.

Another possibility is the driver was pressing the accelerator thus preventing TACC or AEB from braking.

It will be interesting to find out what combination of conditions caused the collision. These cars are not supposed to just run into other moving cars. Something about the car, the sensors, the setup, or the driver's behavior caused it. Or a combination.

Without the collision there would have been no rollover and no need for the seatbelt. So what caused the collision? The Tesla car/driver had time to avoid it.
 
There are more "what ifs" than that. If the pickup had maintained sufficient following distance with the truck ahead, it would have been able to safely abort its lane change. If the pickup had seen the Tesla before attempting the lane change, the crash wouldn't have happened. Others also pointed out that the cargo truck in front made a last-minute lane change of its own that contributed to the accident (leaving the pickup unable to commit to its change).
 
  • Like
Reactions: rxlawdude
Which should be their incentive until they are comfortable standing behind their product.
Nope. You are good at being wrong, though. The product (TACC) increases safety, except in some edge cases. It shouldn't have to function 100 percent of the time, 100 percent correctly, in order to be released; with a responsible driver it will still increase safety overall.
 
This was not some weird "edge case". This was a normal sort of situation (lanes on both sides have slower traffic, car in the left lane signals for a lane change and makes the lane change) that happens constantly when one is driving.

And passive aggressiveness ("you are great at being wrong though") is the last refuge of the dumbass... There is certainly a way to be civil on these boards.
 
  • Disagree
Reactions: rxlawdude
L2 features (both ACC and ACC with lane keeping) reduce mental workload significantly vs manual driving according to a survey of studies. And if that freed mental capacity is used for driving related tasks, it's been shown to help improve situational awareness:
https://ris.utwente.nl/ws/portalfiles/portal/6825168/effects.pdf

That is an eight-year-old piece of research that didn't look at any crash data or at any of the modern systems that are actually in cars on the road today.

It surveyed a bunch of published, even older, articles that either (i) surveyed users of these systems about how the systems made them feel or (ii) tested certain kinds of task reaction times in a simulator.

I'd hardly take that article as proof of anything, much less as proof that Tesla's various implementations of AP actually increase safety as they are actually used.
 
Well, simulations are pretty much the only way to study this in a safe manner. I'm just putting it out there that there's evidence showing such systems reduce mental workload (as all types of automation systems do), and that can help safety. The closest we will get to a survey of modern systems will be once NHTSA's L2 accident reporting has been underway for a while. Then we'll have better-segmented data to compare (for example, to control for road types, a common criticism of Tesla's data).
 
That is an eight-year-old piece of research that didn't look at any crash data or at any of the modern systems that are actually in cars on the road today.

It surveyed a bunch of published, even older, articles that either (i) surveyed users of these systems about how the systems made them feel or (ii) tested certain kinds of task reaction times in a simulator.

I'd hardly take that article as proof of anything, much less as proof that Tesla's various implementations of AP actually increase safety as they are actually used.
So let me ask you: how many Tesla owners use one or more of TACC/AP/EAP/FSD? Exactly how many deaths have been directly attributed to that technology, rather than a human ignoring their duty to take over at any moment?

Now how many deaths have been attributed to the Tesla drivers who don't use TACC/AP/EAP/FSD?

Answer that and your OPINION may be based in some semblance of a fact. The vast majority of your posts are negative towards Tesla, which erodes your credibility.
 
So you think we should evaluate safety systems in a hypothetical world where everyone obeys all the rules and never loses focus? If that were the case in the real world we probably wouldn't need any of these safety systems in the first place.
 
Well, simulations are pretty much the only way to study this in a safe manner. I'm just putting it out there that there's evidence showing such systems reduce mental workload (as all types of automation systems do), and that can help safety. The closest we will get to a survey of modern systems will be once NHTSA's L2 accident reporting has been underway for a while. Then we'll have better-segmented data to compare (for example, to control for road types, a common criticism of Tesla's data).
I guess I'm skeptical (which should surprise no-one) that all incidents from all manufacturers are going to be reported. The severe ones in the media sure, and anything with a clear vehicle data report. But if someone runs over someone's toe or causes damage and doesn't tell anyone they were using Super Cruise (or their car has AEB, etc) at the time then how exactly do these reports get to the manufacturer? There is no requirement for the user to tell the NHTSA, and does every hospital/clinic take detailed information about car model and settings? How is every little crash where I drove into the neighbour's fence on AP with no airbag deployment supposed to be transmitted?

Perhaps they won't get all the incidents, and the NHTSA is just being hopeful. You just have to report the incidents you have 'been made aware of'. Probably a random tweet to twitter@elonmusk or twitter@tesla qualifies.

“Notice” is defined more broadly than in 49 C.F.R. § 579.4 and means information you have received from any internal or external source and in any form (whether electronic, written, verbal, or otherwise) about an incident that occurred or is alleged to have occurred; including, but not limited to vehicle reports, test reports, crash reports, media reports, consumer or customer reports, claims, demands, and lawsuits. A manufacturer or operator has notice of a crash or a specified reporting criterion (i.e., a resulting hospital-treated injury, fatality, vehicle tow-away, air bag deployment, or the involvement of a vulnerable road user) when it has notice of facts or alleged facts sufficient to meet the definition of a crash or a specified reporting criterion, regardless of whether the manufacturer has verified those facts.
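Reading that definition literally, the trigger for a report is "notice" of an incident that meets any one of the listed criteria, verified or not. Here's a rough paraphrase in code, just to make the gap being discussed concrete; the field names are invented for illustration and are not taken from the actual Standing General Order.

```python
# Rough paraphrase of the reporting trigger quoted above. Field names are
# invented for illustration, not taken from the actual order.
from dataclasses import dataclass

@dataclass
class IncidentNotice:
    hospital_treated_injury: bool = False
    fatality: bool = False
    vehicle_towed_away: bool = False
    airbag_deployed: bool = False
    vulnerable_road_user_involved: bool = False   # pedestrian, cyclist, etc.

def meets_reporting_criteria(notice: IncidentNotice) -> bool:
    """Any one criterion is enough; per the quoted text, the facts
    do not have to be verified first, only 'noticed'."""
    return any([
        notice.hospital_treated_injury,
        notice.fatality,
        notice.vehicle_towed_away,
        notice.airbag_deployed,
        notice.vulnerable_road_user_involved,
    ])

# A low-speed scrape into the neighbour's fence with no injury, no tow, and
# no airbag deployment meets none of the criteria, which is the gap being
# discussed here.
print(meets_reporting_criteria(IncidentNotice()))   # False
```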
 
Are we saying that the accident doesn’t happen if it wasn’t a Tesla? Because I think that is the only way Tesla would be liable here.....
It's going to be an interesting case because as far as we know AP didn't malfunction, and the driver behind the wheel didn't misuse the system. At least that hasn't been reported in anything I've read.

Instead it seems to be a very clear-cut challenge on whether the manufacturer of an L2 system shares liability in the event of an accident.

With L2 systems there is no way to prevent a driver from getting complacent. Beyond adding various forms of driver monitoring, there isn't much a manufacturer can do to ensure that the driver pays attention. Even GM's Super Cruise can't detect if someone is daydreaming.
 
  • Like
Reactions: qdeathstar
I guess I'm skeptical (which should surprise no-one) that all incidents from all manufacturers are going to be reported. The severe ones in the media sure, and anything with a clear vehicle data report. But if someone runs over someone's toe or causes damage and doesn't tell anyone they were using Super Cruise (or their car has AEB, etc) at the time then how exactly do these reports get to the manufacturer? There is no requirement for the user to tell the NHTSA, and does every hospital/clinic take detailed information about car model and settings? How is every little crash where I drove into the neighbour's fence on AP with no airbag deployment supposed to be transmitted?

Perhaps they won't get all the incidents, and the NHTSA is just being hopeful. You just have to report the incidents you have 'been made aware of'. Probably a random tweet to twitter@elonmusk or twitter@tesla qualifies.

“Notice” is defined more broadly than in 49 C.F.R. § 579.4 and means information you have received from any internal or external source and in any form (whether electronic, written, verbal, or otherwise) about an incident that occurred or is alleged to have occurred; including, but not limited to vehicle reports, test reports, crash reports, media reports, consumer or customer reports, claims, demands, and lawsuits. A manufacturer or operator has notice of a crash or a specified reporting criterion (i.e., a resulting hospital-treated injury, fatality, vehicle tow-away, air bag deployment, or the involvement of a vulnerable road user) when it has notice of facts or alleged facts sufficient to meet the definition of a crash or a specified reporting criterion, regardless of whether the manufacturer has verified those facts.
Sure, not all accidents will be reported, but that's true of the previous NHTSA data on accidents too (which is mostly based on police reports). However, it'll at least give a general idea of the prevalence of such accidents in the same road types.
 
This was not some weird "edge case". This was a normal sort of situation (lanes on both sides have slower traffic, car in the left lane signals for a lane change and makes the lane change) that happens constantly when one is driving.
TACC
And passive aggressiveness ("you are great at being wrong though") is the last refuge of the dumbass... There is certainly a way to be civil on these boards.

Your appeal to civility really loses its appeal when you drop the D-bomb... and I really do think you are doing a great job at being wrong.
It's going to be an interesting case because as far as we know AP didn't malfunction, and the driver behind the wheel didn't misuse the system. At least that hasn't been reported in anything I've read.

Instead it seems to be a very clear-cut challenge on whether the manufacturer of an L2 system shares liability in the event of an accident.

With L2 systems there is no way to prevent a driver from getting complacent. Beyond adding various forms of driver monitoring, there isn't much a manufacturer can do to ensure that the driver pays attention. Even GM's Super Cruise can't detect if someone is daydreaming.
I agree, but from my point of view the accident would have happened either way, so I don't think you can blame Tesla. The driver wasn't paying attention.

also, I’m surprised that the police found the driver at fault considering the boy wasn’t wearing a seatbelt and swerved into the Tesla’s lane.
 
So let me ask you: how many Tesla owners use one or more of TACC/AP/EAP/FSD? Exactly how many deaths have been directly attributed to that technology, rather than a human ignoring their duty to take over at any moment?

Now how many deaths have been attributed to the Tesla drivers who don't use TACC/AP/EAP/FSD?

One of the most common arguments against L2 is much of the active safety that L2 systems bring can already be implemented with L1+active safety.

Lots of vehicles have lane-keeping systems that keep the driver in the lane. This defeats the supposed safety gain of L2.
Lots of vehicles have adaptive cruise control, which defeats the "second set of eyes" benefit of L2. The adaptive cruise control system is what gives us larger following distances, which improves safety.
Lots of vehicles have FCW.
Lots of vehicles have AEB.

We don't currently have apples-to-apples safety data for L1 driving versus L2. Tesla doesn't provide that with their safety reports; they don't bother correcting for where the system is used, so they compare dissimilar data. That will be the case until FSD Beta has a general release.

It's also really important to do the comparison after the L2 system is sufficiently proficient to lull even the most hardened "I'm not going to trust it" types of people into letting their guard down. In my experience TACC/AP/NoA screws up enough that I tend to be hyper-vigilant with it. My expectation before trusting anything is thousands of miles with zero incidents of it screwing up, and I rarely go more than 100 miles without TACC/AP/NoA screwing up. I have no idea why the driver in this accident allowed this to happen.
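To make the apples-to-apples point concrete, here's a toy calculation with completely invented numbers: if an L2 system is used almost entirely on highways, its fleet-wide crash rate can look far better than manual driving even when it is no safer on any given road type.

```python
# Toy numbers only, invented for illustration; not real crash statistics.
# Each entry: millions of miles driven and crashes on that road type.
manual = {"highway": {"miles_m": 40, "crashes": 80},    # 2.0 crashes per M miles
          "city":    {"miles_m": 60, "crashes": 300}}   # 5.0 crashes per M miles
l2     = {"highway": {"miles_m": 19, "crashes": 38},    # 2.0 per M miles (same as manual)
          "city":    {"miles_m": 1,  "crashes": 5}}     # 5.0 per M miles (same as manual)

def overall_rate(fleet):
    miles = sum(v["miles_m"] for v in fleet.values())
    crashes = sum(v["crashes"] for v in fleet.values())
    return crashes / miles

print(overall_rate(manual))   # 3.8 crashes per million miles
print(overall_rate(l2))       # 2.15 crashes per million miles
# The L2 fleet looks ~43% "safer" overall even though, road type for road type,
# it is exactly as safe as manual driving. The headline number mostly reflects
# where the system gets used, which is why segmented data matters.
```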
 
  • Disagree
Reactions: rxlawdude
but from my point of view the accident would have happened either way
It seems like we'd have to have more information about the driver of the vehicle to make this determination. How many rear-end collisions were they involved in before using Autopilot? I'm pretty sure if you're regularly driving on I-880 20 mph faster than both adjacent lanes in rush hour traffic without looking at the road, you're going to have hit a lot of people.
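The closing-speed arithmetic is worth spelling out (rough numbers, assumed for illustration): at 20 mph faster than the adjacent lane, a cut-in a few car lengths ahead leaves only a second or two before contact, which is right around a typical driver's reaction time.

```python
# Rough arithmetic with assumed numbers: how long does a gap last
# at a 20 mph closing speed?
MPH_TO_FPS = 5280 / 3600                 # 1 mph is about 1.47 ft/s

closing_speed_fps = 20 * MPH_TO_FPS      # ~29.3 ft/s
for gap_ft in (15, 30, 60):              # roughly 1, 2, and 4 car lengths
    print(gap_ft, "ft ->", round(gap_ft / closing_speed_fps, 1), "s to contact")
# 15 ft -> 0.5 s, 30 ft -> 1.0 s, 60 ft -> 2.0 s
# Perception-plus-reaction time is commonly quoted around 1.5 s, so a cut-in
# a couple of car lengths ahead leaves essentially no margin unless the driver
# (or the system) is already watching that lane.
```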
 
also, I’m surprised that the police found the driver at fault considering the boy wasn’t wearing a seatbelt and swerved into the Tesla’s lane.
The police don't look at what caused what damage. They look at which car was responsible for the collision happening. In a rear-ending like this, the car in back is nearly always responsible.

Also, the police finding is done quickly, and has almost nothing to do with what liability will be found if there is any actual lawsuit.

The purpose of the police finding is almost entirely about who gets the points on their record at the DMV.
 
  • Like
Reactions: qdeathstar
You'll crash quickly anywhere driving without looking at the road.

I try to avoid 880 as much as possible as it's a cluster 24/7, but drivers on 880 are changing lanes and driving faster/slower all the time. Accidents happen regularly, and it's been like that since way before Tesla/AP were around.

It seems like we'd have to have more information about the driver of the vehicle to make this determination. How many rear-end collisions were they involved in before using Autopilot? I'm pretty sure if you're regularly driving on I-880 20 mph faster than both adjacent lanes in rush hour traffic without looking at the road, you're going to have hit a lot of people.