
My friend's Model X crashed using AP yesterday

Right. Unless I misunderstood, the driver walked away from the crash, so I'm not seeing why the airbags needed to deploy. It seems some people have the expectation that airbags should deploy in every crash.

People don't understand that airbags are supplementary, not primary, safety systems. An airbag deployment is really expensive to repair and usually causes minor injuries too, so it's nice that they don't go off unless needed.
 
For collisions, we don't have enough public data. However, we can judge safety as listed by fatalities: one fatality per two billion miles when Autopilot is turned off, and one fatality per 130 million miles when Autopilot is turned on. This is fairly strong evidence that Autopilot does not prevent fatalities.

Returning to collisions on "restricted roads": Elon has claimed the accident rate on Autopilot is halved. However, the normal collision rate for cars is roughly 400 per hundred million miles travelled (https://crashstats.nhtsa.dot.gov/Api/Public/ViewPublication/812032). Half that (200) times the number of Autopilot miles driven (130 million) yields 260 collisions. Thus, even if this claim were true, we should have 260 collisions as data, but we've heard of only a handful. The other 255+ were not picked up by the press, for whatever reason (no police report, in a foreign country, driver doesn't blame Autopilot, Tesla non-disclosure agreements). Similarly, Autopilot could have prevented a huge number of collisions that we haven't heard about.
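For anyone who wants to check that arithmetic, here is a minimal back-of-the-envelope sketch in Python; the 400-per-100-million figure and the 130 million Autopilot miles are the numbers cited above, and the halving is Elon's claim:

```python
# Rough check of the expected-collision arithmetic above.
normal_rate = 400                   # collisions per 100 million miles (NHTSA ballpark)
claimed_ap_rate = normal_rate / 2   # Elon's "halved on Autopilot" claim
ap_miles = 130_000_000              # approximate Autopilot miles driven

expected_collisions = claimed_ap_rate * ap_miles / 100_000_000
print(expected_collisions)          # 260.0
```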

So, we have no way to judge, via statistics, whether Autopilot on all "restricted" roads prevents more collisions than non-Autopilot.

However, Tesla does have these statistics, and we do know that Tesla restricts Autopilot speeds on non-divided highways ("restricted" roads). Why? For safety, obviously. This is very, very strong evidence that Tesla believes Autopilot to be riskier on non-divided highways than on divided highways. If they had statistics that proved that Autopilot was as safe on restricted roads, they would have left AP speed up to the discretion of the driver, as they do on highways.

And we have corroborating evidence: we have seen two accidents on restricted roads in which AP did not prevent collisions that any normal driver could have avoided. The only AP fatality seen so far was on a highway where vehicles are permitted to cross traffic. We have evidence that, in these accidents, Autopilot drove terribly. We have warnings like "There may be situations in which Traffic-Aware Cruise Control does not detect a vehicle, bicycle, or pedestrian" listed in the manual. And lastly, we have the owner's manual stating outright that "Autopilot is intended for use on divided highways."

Finally, we have not heard much about Autopilot accident prevention on non-divided highways. At best, we have one instance of AEBS activating, which is always on and distinct from Autopilot.

Consumer Reports agrees with this view. It cites an NHTSA study that found a 3-17 second delay in taking over control from a semi-autonomous car. The report also covers a few other issues with semi-autonomous driving that I have raised in the past.
Tesla's Autopilot: Too Much Autonomy Too Soon
Regaining Control
Research shows that humans are notoriously bad at re-engaging with complex tasks after their attention has been allowed to wander. According to a 2015 NHTSA study (PDF), it took test subjects anywhere from three to 17 seconds to regain control of a semi-autonomous vehicle when alerted that the car was no longer under the computer's control. At 65 mph, that's roughly 300 feet to nearly a third of a mile traveled by a vehicle effectively under no one's control.
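A quick back-of-the-envelope check of those distances (my arithmetic, assuming exactly 65 mph and the study's 3-17 second range):

```python
# Distance covered during a 3-17 second takeover delay at 65 mph.
FEET_PER_MILE = 5280
mph = 65
feet_per_second = mph * FEET_PER_MILE / 3600   # ~95.3 ft/s

for seconds in (3, 17):
    print(f"{seconds} s -> {seconds * feet_per_second:.0f} ft")
# 3 s  -> 286 ft
# 17 s -> 1621 ft (roughly a third of a mile)
```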
 
For collisions, we don't have enough public data. However, we can judge safety as listed by fatalities: one fatality per two billion miles when Autopilot is turned off, and one fatality per 130 million miles when Autopilot is turned on. This is fairly strong evidence that Autopilot does not prevent fatalities.

You're kidding, right? The number of autopilot fatalities (one) is so small that there's no way to draw statistical conclusions. In addition, the fatality rate, in the United States, is 1.1 deaths per 100 million miles (2013 data), not two billion.
 
The car did not ask him to hold the steering wheel when the accident happened; it was still within the two-minute window. (He did tell police his hands weren't on the steering wheel because the car was on AP.) He added that if he did not put his hands on the steering wheel, the car would beep again in about 10-20 seconds, turn on the hazard lights, and slow down to park somewhere. Apparently his car was in the regular AP driving mode when it happened. Tesla's official response emphasized the warning to hold the steering wheel, but, ACCORDING TO THE DRIVER, that warning did not apply at the moment the accident happened.


This sounds like he might have a misunderstanding of AP. You are supposed to have your hands on the wheel at all times according to Tesla, not only when the warning appears.

Model S Autopilot Press Kit | Tesla Motors

Tesla requires drivers to remain engaged and aware when Autosteer is enabled. Drivers must keep their hands on the steering wheel.
 
...I do worry about that a little. If I were involved in an accident that got picked up by the media for whatever reason (via police report, etc.), I wouldn't be too keen on Tesla making statements to the press about what I did or did not do.

First of all, Tesla has not released the name of any of these Autopilot drivers except for the deceased one.

Secondly, Tesla does not routinely disclose accidents.

They only comment on accidents when owners or the public say something negative that contradicts the cars' safe performance.

If you don't like this kind of potential exposure, then Tesla is not for you.
 
You're kidding, right? The number of autopilot fatalities (one) is so small that there's no way to draw statistical conclusions. In addition, the fatality rate, in the United States, is 1.1 deaths per 100 million miles (2013 data), not two billion.
Dude, that's twisting statistics to ludicrous mode! Why don't you compare non-AP Tesla miles vs. AP Tesla miles, or compare with the fatality rates of luxury large sedans? Your stat is nonsense.
 
This sounds like he might have a misunderstanding of AP. You are supposed to have your hands on the wheel at all times according to Tesla, not only when the warning appears.

Model S Autopilot Press Kit | Tesla Motors

Tesla requires drivers to remain engaged and aware when Autosteer is enabled. Drivers must keep their hands on the steering wheel.

No Sir/Madam. No hands needed on wheel. See uncle Lebeau driving hands free in these Tesla commercials shown on CNBS. First few lines:
"Tesla is giving those who own a Model S or Model X the ability to take their hands off the wheel and let their vehicle take control when driving in certain conditions."
First look at Tesla's autopilot technology
 
No Sir/Madam. No hands needed on wheel. See uncle Lebeau driving hands free in these Tesla commercials shown on CNBS.
First look at Tesla's autopilot technology
@srini - Tesla doesn't make any commercials. Whatever you find there is likely the work of the media. And the ability to take your hands off doesn't mean that you throw caution out the window.
@KaiserSoze - you can twist the stats whatever way you want to, but the fact is AP doesn't make driving a Tesla less safe in any way.
@Az_Rael - I echo what you said about the publicity that Tesla accidents, especially Model X accidents, get. I fear that if one of my family members were driving the X and got into an accident, it would get worldwide coverage, and any word they say would be scrutinized and debated all over. And @Tam - I didn't buy a Tesla to have my family's life opened up to everyone.
 
No Sir/Madam. No hands needed on wheel. See uncle Lebeau driving hands free in these Tesla commercials shown on CNBS. First few lines:
"Tesla is giving those who own a Model S or Model X the ability to take their hands off the wheel and let their vehicle take control when driving in certain conditions."
First look at Tesla's autopilot technology

And that encapsulates the problem nicely. The manuals and caution statements all say "hands on wheel!", but they didn't tell the Tesla marketing folks that. No corrections to that article have been made by Tesla, as far as I am aware.

And then we get PR releases admonishing users that their hands must be on the wheel when a few high-profile cases go south.

Will be interesting to see the upcoming blog post on how to "properly" use Autopilot.
 
Dude, that's twisting statistics to ludicrous mode! Why don't you compare non-AP Tesla miles vs. AP Tesla miles, or compare with the fatality rates of luxury large sedans? Your stat is nonsense.

Don't understand your point here. One single fatality out of 130 million miles driven in AP mode is meaningless. Watch as I take my coin out of my pocket and flip it ... oh, heads, looks like there's a 100% chance of heads every time I flip a coin.

If it's the 1.1 fatalities for every 100 million miles driven, then you're simply agreeing with the point that there's not enough data to claim that Autopilot is less safe ... you'd have to control for average age of a Tesla driver, size/weight of vehicle, driving habits, etc. I look forward to you funding the study.
 
You're kidding, right? The number of autopilot fatalities (one) is so small that there's no way to draw statistical conclusions. In addition, the fatality rate, in the United States, is 1.1 deaths per 100 million miles (2013 data), not two billion.

Two billion miles driven comes from Tesla's published numbers. Only one non-traffic fatality has occurred in those two billion miles driven. For Autopilot, it's one fatality per 130 million miles.

You can't compare AutoPilot to U.S. fatality rates, as Teslas are much more advanced, safety-wise, than the general U.S. car population. Much larger crumple zones, automatic emergency braking, strong aluminum shell, more airbags, etc. You must compare Tesla to Tesla.

There's a great binomial confidence calculator online here: Epi Tools - Calculate confidence limits for a sample proportion. Enter in your own assumptions and confidence levels and see what kind of confidence intervals you get.

Examples:
http://epitools.ausvet.com.au/conte...pleSize=130000&Positive=1&Conf=0.85&Digits=12

and:

http://epitools.ausvet.com.au/conte...leSize=2000000&Positive=1&Conf=0.85&Digits=12

You have to pick the number of trials--that is, the number of times a human driver or Autopilot could have produced a fatality, but didn't. Bad weather, merges, debris, traffic, construction, idiots, poor road markings, etc. In the examples above, I've picked one trial for every thousand miles. With that choice, the confidence intervals don't intersect, but only at a CI of 0.85. I was bummed about that, but then realized that I'm fine with it for practical decision making.
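If you would rather script it than use the website, here is a minimal sketch of the same exercise in Python. The Wilson-score method is my assumption and may not match what the Epi Tools page uses, and the one-trial-per-1,000-miles choice is the same assumption as above:

```python
# Wilson-score confidence intervals for one fatality, with one "trial" per
# 1,000 miles (an assumption, as in the post above). Results are per 100M miles.
from math import sqrt
from scipy.stats import norm

def wilson_interval(k, n, conf=0.85):
    """Wilson score CI for a binomial proportion: k events in n trials."""
    z = norm.ppf(1 - (1 - conf) / 2)
    p = k / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

MILES_PER_TRIAL = 1_000
SCALE = 100_000_000 / MILES_PER_TRIAL   # per-trial rate -> per 100M miles

for label, miles in [("Autopilot (130M miles)", 130_000_000),
                     ("All Tesla miles (2B)", 2_000_000_000)]:
    lo, hi = wilson_interval(1, miles // MILES_PER_TRIAL)
    print(f"{label}: {lo * SCALE:.2f} to {hi * SCALE:.2f} fatalities per 100M miles")
# With these inputs the two intervals just barely fail to overlap at 85% confidence.
```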
 
...I didn't buy a Tesla to have my family's life opened up to everyone.

Privacy is a very legitimate concern indeed.

When you buy a Tesla, you have got to understand that your car is logged 24/7.

The best way to keep your privacy is the absence of logging. No logging means no one can reconstruct your detailed timeline.

Thus, I have no answer for you as a Tesla owner, because those cars have all kinds of sensors and can log minute details, as shown in the Top Gear Roadster incident, the New York Times Model S story with reporter Broder, and the first Model S lemon lawsuit, in which the logs showed the frunk was opened and the fuse box tampered with...
 
Two billion miles driven comes from Tesla's published numbers. Only one non-traffic fatality has occurred in those two billion miles driven. For Autopilot, it's one fatality per 130 million miles.

You can't compare AutoPilot to U.S. fatality rates, as Teslas are much more advanced, safety-wise, than the general U.S. car population. Much larger crumple zones, automatic emergency braking, strong aluminum shell, more airbags, etc. You must compare Tesla to Tesla.

I've said it before and will say it again: the only proper methodological approach is to compare AP-equipped cars with otherwise identical cars (same battery/performance trim) that lack AP, over all miles driven (with and without AP enabled). That gives you a "true" (real-world) measure of whether AP-equipped cars are safer. Sort of like an "intention to treat" analysis, if that means anything to anyone.
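For what it's worth, the shape of that comparison might look something like this sketch; the table and its columns (has_ap_hardware, miles, collisions) are hypothetical, since Tesla doesn't publish per-car data:

```python
import pandas as pd

def intention_to_treat_rates(fleet: pd.DataFrame) -> pd.DataFrame:
    """Collision rates per 100M miles, grouped by AP hardware.

    `fleet` is a hypothetical per-car table with columns:
    has_ap_hardware (bool), miles (float), collisions (int).
    Cars are grouped by whether they *have* AP hardware, not by whether
    AP was engaged at the time -- the "intention to treat" idea above.
    """
    totals = fleet.groupby("has_ap_hardware")[["miles", "collisions"]].sum()
    totals["collisions_per_100m_miles"] = (
        totals["collisions"] / totals["miles"] * 100_000_000
    )
    return totals
```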
 
Two billion miles driven comes from Tesla's published numbers. Only one non-traffic fatality has occurred in those two billion miles driven. For Autopilot, it's one fatality per 130 million miles.

Do we really know how many fatalities there have been in a Tesla? Does Tesla even know that number?

And what is a "non-traffic" fatality? I know of at least 5 drivers who have died in a Tesla accident/collision. As far as I know, only one of them was using AP at the time.