Tesla may have been on Autopilot in California crash which killed two

But why apply those extra requirements only to Tesla versus all cars?
Non-Teslas have more rear-end collisions than Teslas. Non-Teslas veer off the road more than Teslas. Non-Teslas kill more people through auto-starting in the garage than Teslas do. Yet no other OEM is under review for any of that. Driver attention is more critical in a non-Tesla, but there is no push to force those OEMs to implement a driver-attention system.

Other automakers face similar requirements and reviews all the time too. Government regulators investigate all car crashes, not just Teslas. And I am pretty sure that if they find an avoidable flaw in the vehicle, they notify the automaker to address it.

But Tesla is in a unique position because Autopilot blurs the line between a driver-assist system and an autonomous driving system. Autopilot is not an autonomous driving system; it certainly cannot handle all aspects of driving. Tesla says the driver needs to pay attention and has a nag system to try to enforce it. Yet at the same time, Tesla adds new features that make Autopilot more and more capable and claims that Autopilot will be "feature complete" and eventually "full self-driving." Owners see Autopilot handle highway driving successfully for hundreds of miles, with automatic lane changes and exit ramps, while all they had to do was tug the wheel, and they can get overconfident that Autopilot is autonomous when it is not. It does not help either that Tesla throws around terms like "hardware capable of full self-driving," which can lead people to assume the car can do more than it really can.

The bottom line is that if you have a system you are trying to make autonomous but that is not autonomous yet, then you definitely need to make sure the driver is paying attention.
 
Not sure if this has been shared before, but here is a good report on car crash statistics. It's a bit dated, but I haven't seen a newer version of this report.

https://www.iihs.org/api/datastoredocument/status-report/pdf/52/3

It's not easy to find data split like this, and such data is typically years old. Driving activity by income level isn't exactly something you can get solid numbers on. The best you can do is figure out the average income of buyers of certain luxury vehicles and then see how those vehicles perform. Are luxury-car safety features and airbags really 10x safer than a Toyota's or a Honda's? No, honestly they are pretty similar these days. I'm sure insurance companies employ some pretty sharp actuaries who would tell you it is all about the driver. I remember a few years ago Model X buyers were reported to have an average AGI of approximately $300k and ages between 40 and 60. That demographic buys Volvos, E-Classes, 7 Series, etc. Not a lot of reckless activity and deaths in those autos.

I like to compare the Model X and the XC90: similar vehicles, similar drivers, similar safety records. One has Autopilot, one does not.
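For what it's worth, comparisons like the ones in that IIHS report are usually made on exposure-normalized rates such as driver deaths per million registered vehicle years, not raw death counts. A minimal sketch of that arithmetic, with entirely made-up counts for two hypothetical models:

```python
# Exposure-normalized death rate, in the style of the IIHS metric
# (driver deaths per million registered vehicle years). All counts invented.

def deaths_per_million_vehicle_years(driver_deaths, registered_vehicle_years):
    """Normalize a raw death count by how much of the fleet was on the road."""
    return driver_deaths / registered_vehicle_years * 1_000_000

# Two hypothetical models with identical raw death counts but different fleet sizes.
model_a = deaths_per_million_vehicle_years(12, 300_000)    # -> 40.0
model_b = deaths_per_million_vehicle_years(12, 1_200_000)  # -> 10.0
print(f"model A: {model_a:.0f} deaths per million registered vehicle years")
print(f"model B: {model_b:.0f} deaths per million registered vehicle years")
```

Even after normalizing for exposure, the remaining differences can say as much about who buys and drives the car as about the car itself, which is the point about demographics above.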
 

There is speaking in generalities and speaking in specifics.
AP provides more coverage than non-AP. However, instead of focusing on that, regulators and the media focus on the coverage it doesn't add. Saying Tesla should add lidar for collision avoidance because that is an area where AP is not great ignores all the cars that have no collision avoidance at all.

Is AP functionality more akin to entrapment/enabling of a driver's bad habits, or is it coercion? I don't text while driving my fully manual truck or my wife's SUV with lane assist, blind-spot detection, and adaptive cruise. I'll say the wife's is safer due to the secondary checks on my following distance and lane positioning (less safe when dealing with the center-console touch screen).

All systems need to assume the driver is reasonable; otherwise, features such as sidewalk-incursion detection and red-light detection would be required. That consumer vehicles are capable of exceeding every US speed limit points to the driver being the source of responsibility. If that base assumption is not allowed, then no cars should be out there.

Lame analogy:
People die in surgery. Yet surgery saves lives. Should surgery be only for cases when it is 100% successful? Do people take more risks due to the ability of medicine to potentially fix them? Does that make the medical profession bad?

Can AP improve? Yes. Does that mean current AP is bad? Does that mean AP is worse than no AP? Which is the same as asking: is AP with even fewer features safer than current AP?

Anyone can crash any car. So I ask, "for the same behavior, are you safer in a Tesla or not?"
 

If I use AP correctly, yes, I think I am safer with AP. Obviously, if I don't use AP correctly, I am not safer.

We need more information about the cause of the accident. We need to know whether AP was at fault or not. If AP was at fault, then surely Tesla should look to address the issue. If AP was not at fault, then no, we should not attack Tesla or attack AP. But we should not just dismiss all accidents because "AP is safer in general."

To use your analogy, if it were found that the surgeon used a defective tool, the hospital would surely move to fix that problem. They would not just say, "It's okay that somebody died from a defective tool, because on average surgeries save more lives." Same with AP: if AP was defective or at fault in some way, then Tesla should address that. We should not just dismiss any issue because "AP is generally safer."
 
It’s not that simple because the existence of driver assist systems changes behavior.
Yeah, I get that. Thus the given of "for the same behavior" in my simplified question.
Is a car with a top speed of 240 less safe than a car with a top speed of 120?
Assuming a safe driver, they are equally safe.
Using rare edge cases: the 240 is safer in events where you need to go over 120. Alternatively, the 120 is safer in cases where the car fails in a way that sends it to its top speed uncontrollably.

We need some baseline to have any useful dialog.

Hypothesis: AP does not make the car less safe (beyond phantom braking, which exposes insufficient following distance by the trailing driver); it may, however, result in the driver's behavior becoming less safe.
 

If the tool represents AP, then if the driver is relying on AP, they are at fault. Just like if a surgeon used a known non-sterile scalpel.

Other than in extreme failure modes, AP can't crash a car if the driver has their hands on the wheel and their eyes on the road. Just like a scalpel can't perform surgery.

And this is why analogies are lame.
 
I'm just saying that I think it's reasonable and well within the authority of the NHTSA to regulate advanced driver assistance systems that are used on public roads if they are being abused and actually decreasing safety (not saying that is what is happening! I don't have the evidence either way). It has nothing to do with fault and no one on this thread has said it's not the driver's fault.

This discussion reminds me of the general aviation industry when “glass cockpits” began rolling out 15 years ago.

The FAA decided not to regulate the technology with required training. The FAA rightly felt that, in a rapidly changing environment, regulatory action was not the most appropriate or effective way to promote safety. Instead, the FAA oversaw the development of a "recommended" training program with input from a number of entities (aviation associations, industry, flight schools, and academia). This transition in the general aviation industry was very effective (it increased safety) without hindering progress.

Just my 2 cents
 

The question is: if L2 systems (Tesla and non-Tesla) eventually lead to many more fatal crashes, then what do we do?
 
Has Cadillac's Super Cruise been involved in any accidents?


That would be an interesting statistic: torque-based driver-assistance systems with no whitelists vs. eye-based systems with whitelists. Which group has fewer accidents with the system engaged?

It looks like Cadillac has started adding non-freeway roads, so it will be driven in more similar scenarios. I notice it is enabled for part of the 395 north of me. I am not sure I would use AP on that road, since it isn't limited-access, so it is interesting that Cadillac is whitelisting those types of roads.
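If engaged-mileage and crash counts for the two approaches were ever published, the comparison would come down to crashes per million engaged miles, plus some honesty about how noisy small counts are. A rough sketch with invented numbers for both hypothetical groups:

```python
import math

# Invented counts purely for illustration: crashes and miles driven with the
# system engaged, for two hypothetical groups of cars.
groups = {
    "torque-based, no whitelist": {"crashes": 18, "million_miles": 250.0},
    "eye-based, whitelisted":     {"crashes": 4,  "million_miles": 60.0},
}

for name, g in groups.items():
    rate = g["crashes"] / g["million_miles"]               # crashes per million engaged miles
    stderr = math.sqrt(g["crashes"]) / g["million_miles"]  # rough Poisson standard error
    print(f"{name}: {rate:.3f} +/- {stderr:.3f} crashes per million engaged miles")
```

With counts this small the uncertainty bands overlap, and the two systems also operate on very different road types (whitelisted freeways vs. everything AP allows), so even published numbers would be hard to compare cleanly.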
 
The question is: if L2 systems (Tesla and non-Tesla) eventually lead to many more fatal crashes, then what do we do?

They won't, so we don't need to worry about it. Tesla's AP and EAP have been on the roads for what, two years now? Three? There have not been "many more fatal crashes." The above question is an attempt to introduce FUD into the discussion.

You might as well ask, "What if space aliens decide to disintegrate all Ford cars with their ray guns? Should we ban Fords from the roads?"
 

It's not just Tesla, though. As more cars get these capabilities, it's possible for more people to get distracted. Look at how many people are distracted by cell phones while driving, which has led to laws forbidding cell phone use behind the wheel. Do you really think people won't turn on Autopilot or an equivalent system and use their cell phones if they get the chance?
 
From reviews, Cadillac's Super Cruise turns off if it detects the driver is distracted.
 
The question is: if L2 systems (Tesla and non-Tesla) eventually lead to many more fatal crashes, then what do we do?
Increase requirements for getting a driving license.

You might as well ask, "What if space aliens decide to disintegrate all Ford cars with their ray guns? Should we ban Fords from the roads?"
No need, they would all have been vaporized.

The vast majority of accidents are caused by drivers doing something unsafe. Any study of automobile safety must take into account the behavior of real drivers, not only "safe" drivers.

Yes, but to say whether system X increases or decreases safety on a case-by-case basis, you need to keep the driver variable constant. Otherwise, you end up comparing the accident rate of a safe driver without system X to that of an unsafe driver with system X.

Is an unsafe driver with AEB/FCW/Autosteer safer overall (and are the people around them safer) than the same driver without those additional systems? Do those features, on their own, make the vehicle less safe to operate?

On the human factors side of things, the question can be asked: do the extra safety nets move the drivers from safe to unsafe?
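To make the "keep the driver variable constant" point concrete, here is a minimal simulation sketch in Python; every probability in it is invented purely for illustration. The hypothetical system lowers crash risk for both driver types, but because it is adopted mostly by inattentive drivers in this toy model, the naive with-vs-without comparison makes it look worse:

```python
import random

random.seed(0)

# Invented crash probabilities for each (driver type, has system) pair.
# Note the system lowers crash risk for BOTH driver types.
CRASH_PROB = {
    ("attentive",   False): 0.010,
    ("attentive",   True):  0.008,
    ("inattentive", False): 0.050,
    ("inattentive", True):  0.040,
}

def naive_crash_rates(n_drivers=200_000,
                      p_inattentive_with_system=0.7,
                      p_inattentive_without_system=0.3):
    """Crash rates with vs. without the system when the system is adopted
    disproportionately by inattentive drivers (driver variable NOT held constant)."""
    crashes = {True: 0, False: 0}
    counts = {True: 0, False: 0}
    for _ in range(n_drivers):
        has_system = random.random() < 0.5
        p_inatt = (p_inattentive_with_system if has_system
                   else p_inattentive_without_system)
        driver = "inattentive" if random.random() < p_inatt else "attentive"
        counts[has_system] += 1
        if random.random() < CRASH_PROB[(driver, has_system)]:
            crashes[has_system] += 1
    return crashes[True] / counts[True], crashes[False] / counts[False]

with_rate, without_rate = naive_crash_rates()
print(f"naive crash rate with system:    {with_rate:.4f}")    # ~0.030
print(f"naive crash rate without system: {without_rate:.4f}")  # ~0.022
# The system helps every individual driver type, yet the naive comparison
# shows a higher rate for the with-system group, because the two groups
# contain different mixes of drivers.
```

That selection-bias trap is why aggregate with/without-Autopilot accident-rate comparisons are hard to interpret unless driver behavior and road type are controlled for.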
 
Yes, but to say whether system X increases or decreases safety on a case-by-case basis, you need to keep the driver variable constant. Otherwise, you end up comparing the accident rate of a safe driver without system X to that of an unsafe driver with system X.
You need to use a representative sample of all drivers if you're trying to measure overall safety. Considering only "safe" drivers makes no sense since not all drivers are above average (though most think they are!).
Is an unsafe driver with AEB/FCW/Autosteer safer overall (and are the people around them safer) than the same driver without those additional systems? Do those features, on their own, make the vehicle less safe to operate?
That's a very good question. I think there are definitely some drivers who abuse Autosteer to such a degree that they are significantly less safe (and who would not be, say, watching movies while driving if they didn't have Autosteer). It's just a question of how many of them there are relative to the people who get a safety benefit from the system. I doubt that AEB or FCW reduce driver vigilance, but it certainly should be studied.
On the human factors side of things, the question can be asked: do the extra safety nets move the drivers from safe to unsafe?
Maybe. It's certainly being studied: New study: Adaptive cruise-control and other driver-assistance systems may increase distracted driving
 
You need to use a representative sample of all drivers if you're trying to measure overall safety. Considering only "safe" drivers makes no sense since not all drivers are above average (though most think they are!).

Sure, I agree there.
I suppose my issue is "safer at the system level" vs. "safer at the system + driver level," i.e., what it means to say a system is less safe.

4WD is more capable than 2WD, but is 4WD safer than 2WD if drivers get used to the added acceleration traction but do not consider that their braking ability is unchanged?

In the event of an inattentive driver, is it better to have AP? I'd say yes.
Does having AP cause more inattentive drivers and thus increase the rate of accidents? And if so, should action be taken? This seems to be the issue under debate.

And which will arrive first, Level 4 or an actionable conclusion? :)
 
My guess is that at some point regulators will decide torque/touch-based attention-monitoring systems are not sufficient and will require different methods in future cars.

I know Ford is working on an eye-based system like GM's for the Mach-E, so I would not be surprised if that ends up being the trend.