Another tragic fatality with a semi in Florida. This time a Model 3

My personal, statistically unsubstantiated, belief is that AP is a helpful driver aid tool that improves safety on average, but is deadly under certain poorly documented and ever evolving circumstances.

What, do you feel, makes AP deadly?
In order for the car to crash into something, the driver has to allow it to. The driver can override the steering, acceleration, and braking at any moment.

In pretty much any other car, if the driver relinquishes control to the car, it will soon be in the ditch or head-on into another car. Does that not make AP less deadly?

Therein lies the problem with Tesla's "AP is safer" argument: disclaimers absolve AP/EAP of responsibility to work properly under any and all conditions, yet Tesla and its fans keep arguing that the AP is still safer than not using it.
A system does not need to be 100% to be better than not having it.
AEB/FCW are assist systems also. If the car has them and still crashes, that is the driver's fault, not the car's.
Airbags do not prevent 100% of fatalities, nor do seat belts. In some types of accidents airbags specifically do not deploy. In some cases, the airbag deployment increases injury. However, airbags, on the whole, increase safety and reduce the risk of serious injury.
 
"What, do you feel, makes AP deadly?"

Skills atrophy, combined with a basic inability of the system to see large stationary objects such as trucks, freeway lane barriers, and stopped cars. AP is so good under normal circumstances that heavy AP users eventually get to the point where they trust the system too much, and their own ability to remain vigilant and ready to take over quickly diminishes.

I don't understand how people don't get that. You can debate the driver's responsibility until the cows come home. Yes, it is always the driver's duty to be in control of the vehicle. Turning on Autopilot cedes physical control and then requires mental control to safely drive the vehicle. That mental sharpness fades with repeated use of AP. It's like my teenage daughter, who has trouble with simple math because her phone does all the work most of the time.
I did not buy EAP, and will not until the system can manage not to decapitate me when a freaking 60-foot tractor trailer crosses in front of me.
It's cool to play around with, and I have. It has also scared me more than once.
Everyone reading this probably has been scared at some point.
Look, I love Tesla and wish them the best getting FSD to market first.
I just don't want to die helping them get there.
 
"What, do you feel, makes AP deadly?"

Skills atrophy, combined with a basic inability of the system to see large stationary objects such as trucks, freeway lane barriers, and stopped cars. AP is so good under normal circumstances that heavy AP users eventually get to the point where they trust the system too much, and their own ability to remain vigilant and ready to take over quickly diminishes.

I don't understand how people don't get that. You can debate the driver's responsibility until the cows come home. Yes, it is always the driver's duty to be in control of the vehicle. Turning on Autopilot cedes physical control and then requires mental control to safely drive the vehicle. That mental sharpness fades with repeated use of AP. It's like my teenage daughter, who has trouble with simple math because her phone does all the work most of the time.
I did not buy EAP, and will not until the system can manage not to decapitate me when a freaking 60-foot tractor trailer crosses in front of me.
It's cool to play around with, and I have. It has also scared me more than once.
Everyone reading this probably has been scared at some point.
Look, I love Tesla and wish them the best getting FSD to market first.
I just don't want to die helping them get there.
Skills atrophy is a well-known phenomenon in the aviation industry due to automation. I am sure car automation will cause the same problems. Not to say it shouldn't be used, though... it just has to be very reliable, and drivers will have to drive manually on a regular basis.
 
"What, do you feel, makes AP deadly?"

Skills atrophy, combined with a basic inability of the system to see large stationary objects such as trucks, freeway lane barriers, and stopped cars. AP is so good under normal circumstances that heavy AP users eventually get to the point where they trust the system too much, and their own ability to remain vigilant and ready to take over quickly diminishes.

I don't understand how people don't get that. You can debate the driver's responsibility until the cows come home. Yes, it is always the driver's duty to be in control of the vehicle. Turning on Autopilot cedes physical control and then requires mental control to safely drive the vehicle. That mental sharpness fades with repeated use of AP. It's like my teenage daughter, who has trouble with simple math because her phone does all the work most of the time.
I did not buy EAP, and will not until the system can manage not to decapitate me when a freaking 60-foot tractor trailer crosses in front of me.
It's cool to play around with, and I have. It has also scared me more than once.
Everyone reading this probably has been scared at some point.
Look, I love Tesla and wish them the best getting FSD to market first.
I just don't want to die helping them get there.
While I don't disagree with most of your arguments, I wonder if even a skilled human driver could have avoided crashing into that truck that entered the road? I simply don't know enough about what happened to have an opinion on that accident. I do, however, know of someone who died in what sounds like a similar accident, and it was many years before most cars even had cruise control, much less AP. I contend there will always be scenarios where the most astute, cautious, and skilled driver will die in an automobile crash. The real question is: in the future, will things like real FSD have value in that they save many more lives than they take?
 
While I don't disagree with most of your arguments, I wonder if even a skilled human driver could have avoided crashing into that truck that entered the road? I simply don't know enough about what happened to have an opinion on that accident. I do, however, know of someone who died in what sounds like a similar accident, and it was many years before most cars even had cruise control, much less AP. I contend there will always be scenarios where the most astute, cautious, and skilled driver will die in an automobile crash. The real question is: in the future, will things like real FSD have value in that they save many more lives than they take?
That accident occurred on an open road in good lighting conditions. Any attentive driver could and would have avoided it.
 
But, would the driver have been attentive without autopilot? Whatever happened to cause him to be distracted for 10 seconds after engaging autopilot, would that still have caused the driver to be distracted without AP? We can’t know, but I don’t think it’s safe to assume either way.
 
But, would the driver have been attentive without autopilot? Whatever happened to cause him to be distracted for 10 seconds after engaging autopilot, would that still have caused the driver to be distracted without AP? We can’t know, but I don’t think it’s safe to assume either way.
That is not realistic. I see drivers on cell phones all the time. What I see is them looking down and then back up every 2-3 seconds. For someone to not look at the road for 10 full seconds tells me he was either passed out (a medical condition) or so used to Autopilot working correctly that he just got on his phone and did not look up for 10 seconds straight.
Occam's razor tells me the latter is probably the correct answer.
 
That is not realistic. I see drivers on cell phones all the time. What I see is them looking down and then back up every 2-3 seconds. For someone to not look at the road for 10 full seconds tells me he was either passed out (a medical condition) or so used to Autopilot working correctly that he just got on his phone and did not look up for 10 seconds straight.
Occam's razor tells me the latter is probably the correct answer.

Sure. Or maybe his phone fell into the passenger footwell. Maybe he was talking to someone or following navigation on his phone and needed it back. Maybe he would have leaned over and reached for it regardless, and thought it would be safer to put on AP first.
 
What, do you feel, makes AP deadly?

In order for the car to crash into something, the driver has to allow it to. The driver can override the steering, acceleration, and braking at any moment.

Others have brought up skills atrophy, which is definitely a consideration.

I would also suggest that competent and "safer" use of AP requires developing a brand new set of skills: attentive and focused monitoring of AP operation. It is a uniquely different skill set from driving, and without training (which is not available), it leads to over-relying on AP. That over-reliance leads to accidents, and the occasional death.

Over-reliance would be less of a problem if AP were 99+% successful at what it does, and if the remaining 1% of failure modes were well documented and easy to understand and avoid.

As it is, we are jointly figuring out the conditions that trigger those 1% error states (or 2%, or 5%, who knows), so that we can proactively mitigate them.

I find the need to undertake this task highly undesirable, and an outcome of Tesla's irresponsible lack of communication.

If Tesla were to come forward and share the scenarios where AP is expected to struggle, and how ongoing updates change the probability of triggering those error conditions, my confidence and comfort with AP would be significantly higher. As it is, it's dropping with every publicized death involving over-reliance on AP.

In pretty much any other car, if the driver relinquishes control to the car, it will soon be in the ditch or head-on into another car. Does that not make AP less deadly?

In any other car, relinquishing control of the car would be an insanely stupid and irresponsible thing to do.

In a Tesla on AP, it is the expected thing to do.
I know you can counter-claim that the Tesla manual says never to take your eyes off the road and to remain in full control. But it does not work like that in real life, with or without disclaimers. I can't drive on AP with my hands on the steering wheel without getting in the way of Autosteer and disabling it, so I have to keep my hands off the wheel. Same with the feet: off the pedals and relaxed in the resting position. That is a far better and less stressful way to commute, but it also leads to less mental and physical alertness around driving tasks and more around everything else (home, work, etc.). At least for me.

Once AP is engaged, the expected and desired action is to deploy a newly developed skill set: continuously monitoring and validating AP's performance without losing focus or attention. That is not a skill set any of us were born with, nor one we have had any practice developing, until now.

A system does not need to be 100% to be better than not having it.

AEB/FCW are assist systems also. If the car has them and still crashes, that is the driver's fault, not the car's.

Airbags do not prevent 100% of fatalities, nor do seat belts. In some types of accidents airbags specifically do not deploy. In some cases, the airbag deployment increases injury. However, airbags, on the whole, increase safety and reduce the risk of serious injury.

I agree with your logic, to a point.
I find AEB/FCW highly unreliable, and either turn them off in all my cars (all of them have it, in one form or another) or ignore them.

The key distinction to me is that airbags mitigate consequences of an accident.
AP is in another category of driver's aids that attempt to prevent the accident in the first place.

TC (traction control) has been doing that in most cars for a few decades now. Drivers have learned to trust TC engagement, as it never fails or suddenly goes berserk in the middle of a turn.

If my TC only worked 99% of the time, I would demand to know what the other 1% of situations are, and how to avoid them. Otherwise, I would find a way to permanently disable the unreliable TC.

I have the same concerns with AP.

Over-relying on AP, which is inevitable, can increase the probability of an accident. I did not argue that it does, since I don't have the data (none of us do, and Tesla is tight-lipped), but without knowing those 1% use cases, I can't preclude the possibility that it would.

Again, if Tesla were upfront about sharing that info, I would really appreciate it, and I would enjoy driving on AP again.

 
The key distinction to me is that airbags, like ABS, mitigate consequences of an accident.

AP is in another category of driver's aids that attempt to prevent the accident in the first place.

And this is the crux of the problem. There is no way to tell how many accidents AP prevents, so there is only a one-sided measurement (the times it doesn't prevent an accident).
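
To make the one-sided measurement concrete, here is a toy sketch in Python (all numbers are invented, not real Tesla or NHTSA figures) showing that very different underlying realities are consistent with the same observed crash count, which is why a counterfactual baseline is needed before anyone can claim AP prevents more than it causes:

```python
# Toy illustration of the one-sided measurement: the crash record only contains
# the accidents AP failed to prevent. Very different underlying realities are
# consistent with the same observed number. All figures below are invented.

observed_ap_crashes = 45  # crashes that actually show up in the data

# (crashes that would have happened without AP, crashes caused by AP itself)
scenarios = {
    "AP prevents a lot":     (200, 5),
    "AP roughly neutral":    (50, 40),
    "AP makes things worse": (20, 45),
}

for name, (without_ap, caused_by_ap) in scenarios.items():
    happened_anyway = observed_ap_crashes - caused_by_ap
    prevented = without_ap - happened_anyway  # invisible in the crash record
    net_avoided = prevented - caused_by_ap
    print(f"{name}: net crashes avoided = {net_avoided:+d}")

# The 45 observed crashes are identical in every scenario; estimating the
# prevented term requires a counterfactual baseline of comparable non-AP driving.
```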
 
By way of example, the airline industry has recently been pushing back on changes by the manufacturers, requiring training where there have been significant enhancements or modifications to the operation of aircraft. It is probably a good idea for Tesla to consider adding ways to inform and train owners on its use.

Maybe Tesla does not want to provide conflicting information, e.g. such training would contradict the marketing hype around AP and FSD?!?
 
Given the rate of enhancements to the software, existing owners would likely benefit from similar workshops. Since the closing of the showroom, I have not heard of other workshops.

Does the car keep a log of all the OTA updates? Owners should be aware of what gets changed on the fly, e.g. any change that touches a safety-critical system.

Also, the fundamental safety issues associated with these fatal crashes go beyond what workshops can address.
 
afadeev said:
AP is in another category of driver's aids that attempt to prevent the accident in the first place.

And this is the crux of the problem. There is no way to tell how many accidents AP prevents, so there is only a one-sided measurement (the times it doesn't prevent an accident).

I agree with the above observation, but it's not the crux of the problem. It is a consequence of Tesla not sharing those AP performance data points, except in the form of high-level, self-congratulatory summaries.

To me, the crux of the problem is lack of data, and lack of communication from Tesla.
Without that data, my statistically insignificant observations are as follows:
  1. I hadn't had an at-fault accident in the decades before getting a Tesla, and haven't had one since. Therefore, the objective marginal safety value of Tesla's AP, to me, is zero.
  2. The convenience and commuting comfort values are clearly positive. But at a price. The less focused I am on driving and supervising AP, the greater the convenience and commuting comfort benefits!
  3. My early over-reliance on AP has led me into far too many emergency disengagements, or "oh sh*t" situations, in which AP would have caused an accident had I not intervened. Thus the subjective net safety impact on my driving experience has initially been marginally negative, though I'm learning to mitigate that by reducing my over-reliance on AP.
  4. I'm learning to pro-actively avoid "AP fail" conditions with the help of personal experience, and through shared experience from folks on this and other forums, but with zero help or instruction from Tesla.
So far, AP's value proposition to me is a trade-off: improved commuting comfort at the expense of marginal subjective decrease in safety. I am working on maintaining the former while mitigating the latter, but it's very much a work in progress.

A counter-argument to the above would be: AP is the savior, and an average Tesla driver would have crashed long ago had they not been driving on AP. AP is a blessing, and anyone who argues otherwise is a TSLA short.

Maybe that's true. Then again, we don't have the data to argue one way or the other.

Maybe Tesla does not want to provide conflicting information, e.g. such training would contradict the marketing hype around AP and FSD?!?

Very likely to be true.

Training sessions also cost money.
I would actually pay out of pocket to learn about AP limitations and how to best avoid them, but that is presently not on the menu.


 
Limited access means on-ramp/off-ramp: your 280, 680, 880. Those are limited-access highways for using AP.

SR7/441 is a wide, multi-lane rural road with heavy usage in parts and unsignaled cross traffic of all types: cars, tractors, tractor trailers. Not a candidate for AP in its current configuration, and FSD is currently nowhere.
 
My early over-reliance on AP has led me into far too many emergency disengagements, or "oh sh*t" situations, in which AP would have caused an accident had I not intervened. Thus the subjective net safety impact on my driving experience has initially been marginally negative, though I'm learning to mitigate that by reducing my over-reliance on AP.

Have you seen this report: http://www.safetyresearch.net/Library/NHTSA_Autosteer_Safety_Claim.pdf - They attempted to replicate the results of the NHTSA study concerning the number of airbag deployments after the introduction of Autosteer in their vehicles, and in so doing came to the opposite conclusion to the one NHTSA originally did:

The model estimated from these specific data helps to answer the question concerning NHTSA’s safety claim about Autosteer, “Is the installation of Autosteer associated with a decreased risk of an airbag deployment crash, controlling for exposure mileage?” The answer is “No.” Table 1 demonstrates that Autosteer is actually associated with an increase in the odds ratio of airbag deployment by more than a factor of 2.4
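
For anyone unsure what "controlling for exposure mileage" means in practice: raw deployment counts before and after Autosteer tell you nothing by themselves; the comparison only makes sense as a rate per mile driven. Here is a rough sketch with made-up numbers (not the report's or NHTSA's actual data, and using a simple rate ratio rather than the report's exact model):

```python
# Sketch of an exposure-controlled comparison, analogous in spirit to the
# airbag-deployment analysis discussed above. Counts and mileages are made up
# purely to show the arithmetic; they are not taken from any real dataset.

def deployments_per_million_miles(deployments: int, miles: float) -> float:
    return deployments / (miles / 1_000_000)

# Hypothetical fleet exposure before and after Autosteer was installed.
before_rate = deployments_per_million_miles(deployments=32, miles=40_000_000)
after_rate = deployments_per_million_miles(deployments=64, miles=55_000_000)

print(f"Before Autosteer: {before_rate:.2f} deployments per million miles")
print(f"After Autosteer:  {after_rate:.2f} deployments per million miles")
print(f"Rate ratio: {after_rate / before_rate:.2f}")
# A ratio above 1 points toward more deployments per mile with Autosteer;
# below 1 points the other way. Raw counts alone cannot distinguish the two.
```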

Tesla has stated that they are opposed to the public release of their data because of the risk that the data may be interpreted negatively. Whether this research is an example of that, or whether it is another demonstration as to why the data should be made public, I can't say for sure.

That said, I don't see how anything other than transparency can be a long-term strategy, given the increased reliance on automation and their desire to be approved for higher levels of driving automation. Their data will need to be examined by many regulatory approval bodies internationally, and one would hope that their conclusions would be based on a sound discovery process, and not later brought into question.

Given the high profile nature of the Boeing incident and what we are coming to learn, I don't think anything other than transparency will work here -- even the regulatory bodies need oversight.

For now, all we have is our own personal subjective views, but so far it's gotten us here.
 
Limited access means on-ramp/off-ramp: your 280, 680, 880. Those are limited-access highways for using AP.

SR7/441 is a wide, multi-lane rural road with heavy usage in parts and unsignaled cross traffic of all types: cars, tractors, tractor trailers. Not a candidate for AP in its current configuration, and FSD is currently nowhere.
Maybe, but that's not what the manual says. According to your definition you shouldn't be using Autopilot on most highways away from urban areas (e.g. here in CA the US101 or I-5 also have cross traffic and the other characteristics you mention in more rural areas).
 
Maybe, but that's not what the manual says. According to your definition you shouldn't be using Autopilot on most highways away from urban areas (e.g. here in CA the US101 or I-5 also have cross traffic and the other characteristics you mention in more rural areas).


It's not his definition, it's the actual definition.

Here's Consumer Reports making the same point-

Tesla Driver in Fatal March Crash Was Using Autopilot, NTSB Says

CR said:
in both cases, the cars were on highways driving at high rates of speed. Neither driver was on a true limited-access highway, like an interstate; both had intersections that allowed access to cross traffic.

True limited access highways are what AP is intended to be used on. It's not designed or intended to handle cross-traffic. At all.
 
Have you seen this report: http://www.safetyresearch.net/Library/NHTSA_Autosteer_Safety_Claim.pdf - They attempted to replicate the results of the NHTSA study concerning the number of airbag deployments after the introduction of autosteering in their vehicles, and in so doing, came to the opposite conclusion the NHTSA originally did:


FWIW, as far as I can tell, the "group" behind that study is basically one dude and his wife, who do "studies" for people involved in class-action lawsuits against car companies (the Ford tire lawsuits, for example).

So between that background and the fact that they got to their conclusions by tossing out most of the data NHTSA used, I'm... dubious about their results.


That said, NHTSA's results aren't very useful either, as their own sample was a pretty small set and covered AP1, so it's fairly irrelevant today anyway.
 
Maybe, but that's not what the manual says. According to your definition you shouldn't be using Autopilot on most highways away from urban areas (e.g. here in CA the US101 or I-5 also have cross traffic and the other characteristics you mention in more rural areas).
You're right, 101 has many areas with no off-ramps. If a tractor trailer crosses your path while you're on AP with no hands on the wheel, looking in the back seat for something, expect the worst. Get your will updated, because full attention and hands on the wheel at all times are the prescription, even with on- and off-ramps.

Sorry to be the bearer of bad news. If that's even bad news; it really isn't. It's called driving, which we should all currently be doing. AP is currently assistive, even if you bought FSD.
 