Right. Unless I misunderstood, the driver walked away from the crash, so not seeing why the airbags needed to deploy. It seems some people have the expectation that airbags should deploy in every crash.

Airbag deployment needs enough g-force. Apparently there was not that much deceleration.
Right. Unless I misunderstood, the driver walked away from the crash, so not seeing why the airbags needed to deploy. It seems some people have the expectation that airbags should deploy in every crash.
As for why people trust Tesla's claims with the logs, people just tend to trust what is written.
For collisions, we don't have enough public data. However, we can judge safety by fatality rate: one fatality per two billion miles when Autopilot is turned off, versus one fatality per 130 million miles when Autopilot is turned on. This is fairly strong evidence that Autopilot does not prevent fatalities.
Returning to collisions on "restricted roads": Elon has claimed the accident rate on Autopilot is halved. However, the normal collision rate for cars is roughly 400 per hundred million miles travelled (https://crashstats.nhtsa.dot.gov/Api/Public/ViewPublication/812032). Half that (200 per hundred million miles) times the number of Autopilot miles driven (130 million) yields 260 collisions. Thus, even if the claim were true, we should have roughly 260 collisions on record, yet we've heard of only a handful. The other 255+ were not picked up by the press, for whatever reason (no police report, in a foreign country, driver doesn't blame Autopilot, Tesla non-disclosure agreements). By the same token, Autopilot could have prevented a huge number of collisions that we haven't heard about.
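The arithmetic above is easy to sanity-check. A minimal sketch, with all inputs taken from the post itself (the ~400-per-100M NHTSA rate, the claimed halving, and Tesla's ~130 million Autopilot miles):

```python
# Back-of-the-envelope check of the expected-collision count.
crashes_per_100m_miles = 400                # rough NHTSA rate, all cars (per the post)
halved_rate = crashes_per_100m_miles / 2    # Elon's claim: rate cut in half on AP
autopilot_miles = 130e6                     # Tesla's published Autopilot mileage

expected_collisions = halved_rate * (autopilot_miles / 100e6)
print(expected_collisions)  # 260.0
```

So even under the optimistic halved rate, roughly 260 Autopilot collisions should exist somewhere in the data.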
So, we have no way to judge, via statistics, whether Autopilot on all "restricted" roads prevents more collisions than non-Autopilot.
However, Tesla does have these statistics, and we do know that Tesla restricts Autopilot speeds on non-divided highways ("restricted" roads). Why? For safety, obviously. This is very, very strong evidence that Tesla believes Autopilot to be riskier on non-divided highways than on divided highways. If they had statistics that proved that Autopilot was as safe on restricted roads, they would have left AP speed up to the discretion of the driver, as they do on highways.
And we have corroborating evidence: we have seen two cases of drivers on AP on restricted roads where Autopilot did not prevent accidents that any normal driver could have prevented. The only AP fatality seen so far was on a highway on which vehicles are permitted to cross traffic. We have evidence that, in these accidents, Autopilot drove terribly. We have warnings like "There may be situations in which Traffic-Aware Cruise Control does not detect a vehicle, bicycle, or pedestrian" listed in the manual. And lastly, we have the owner's manual stating outright that "Autopilot is intended for use on divided highways."
Finally, we have not heard much about Autopilot accident prevention on non-divided highways. At best, we have one instance of AEBS activating, which is always on and distinct from Autopilot.
Regaining Control
Research shows that humans are notoriously bad at re-engaging with complex tasks after their attention has been allowed to wander. According to a 2015 NHTSA study (PDF), it took test subjects anywhere from three to 17 seconds to regain control of a semi-autonomous vehicle when alerted that the car was no longer under the computer's control. At 65 mph, that's roughly 290 to 1,600 feet traveled by a vehicle effectively under no one's control.
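The distance figures are just speed times takeover time; a quick unit-conversion sketch (the 3-17 second range is from the study as quoted above):

```python
# Distance covered while "no one" is driving: speed x takeover time.
speed_mph = 65
speed_fps = speed_mph * 5280 / 3600   # ~95.3 feet per second

for seconds in (3, 17):
    feet = speed_fps * seconds
    print(f"{seconds:>2} s -> {feet:,.0f} ft ({feet / 5280:.2f} mi)")
# 3 s  -> 286 ft (0.05 mi)
# 17 s -> 1,621 ft (0.31 mi)
```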
...I asked my friend the driver again. As of today, Tesla has not contacted him in any way....
...It kind of shocked us in the WeChat message group when the driver said that none of the airbags were deployed...
The car did not ask him to hold the steering wheel when the accident happened; it was still inside the "within two minutes" range. (He did tell the police his hands weren't on the steering wheel because the car was on AP.) He explained that if he did not put his hands on the steering wheel, the car would beep again within 10-20 seconds, turn on the emergency flashers, and slow down to park somewhere. Apparently his car was in the regular AP driving mode when it happened. Tesla's official response emphasized the rule about the hold-the-steering-wheel warning, which does not apply to the moment when the accident happened, ACCORDING TO THE DRIVER.
...I do worry about that a little. If I was involved in an accident that got picked up by the media for whatever reason (via police report, etc), I wouldn't be too keen on Tesla making statements about what I did or did not do to the press.
If you don't like this kind of potential exposure, then Tesla is not for you.
This sounds like he might have a misunderstanding of AP. You are supposed to have your hands on the wheel at all times according to Tesla, not only when the warning appears.
Model S Autopilot Press Kit | Tesla Motors
Tesla requires drivers to remain engaged and aware when Autosteer is enabled. Drivers must keep their hands on the steering wheel.
@srini - Tesla doesn't make any commercials. Whatever you find there is likely the work of the media. And the ability to take hands off doesn't mean that you throw caution out the window.
No Sir/Madam. No hands needed on wheel. See uncle Lebeau driving hands free in these Tesla commercials shown on CNBS. First few lines:
"Tesla is giving those who own a Model S or Model X the ability to take their hands off the wheel and let their vehicle take control when driving in certain conditions."
First look at Tesla's autopilot technology
Dude, that's twisting statistics to ludicrous mode! Why don't you compare non-AP Tesla miles vs. AP tesla miles, or compare with the fatality rates of luxury large sedans? Your stat is nonsense.
You're kidding, right? The number of autopilot fatalities (one) is so small that there's no way to draw statistical conclusions. In addition, the fatality rate, in the United States, is 1.1 deaths per 100 million miles (2013 data), not two billion.
...I didn't buy a Tesla to have my family's life opened up to everyone.
Two billion miles driven comes from Tesla's published numbers. Only one non-Autopilot fatality has occurred in those two billion miles driven. For Autopilot, it's one fatality per 130 million miles.
You can't compare AutoPilot to U.S. fatality rates, as Teslas are much more advanced, safety-wise, than the general U.S. car population. Much larger crumple zones, automatic emergency braking, strong aluminum shell, more airbags, etc. You must compare Tesla to Tesla.
There's a great binomial confidence calculator online here: Epi Tools - Calculate confidence limits for a sample proportion. Enter in your own assumptions and confidence levels and see what kind of confidence intervals you get.
Examples:
http://epitools.ausvet.com.au/conte...pleSize=130000&Positive=1&Conf=0.85&Digits=12
and:
http://epitools.ausvet.com.au/conte...leSize=2000000&Positive=1&Conf=0.85&Digits=12
You have to pick the number of trials--that is, the number of times a human driver or Autopilot could have produced a fatality, but didn't: bad weather, merges, debris, traffic, construction, idiots, poor road markings, etc. In the examples above, I've picked one trial for every thousand miles. The resulting confidence intervals don't intersect, but only at a CI of 0.85. I was bummed about that, but then realized that I'm fine with it for practical decision making.
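Epi Tools supports several interval methods, and which one the linked pages use is not stated here. As a sketch, here is the Wilson score interval (one common choice; exact Clopper-Pearson bounds will differ somewhat) applied to the trial counts above, using only the Python standard library:

```python
from statistics import NormalDist

def wilson_ci(successes, trials, confidence):
    """Wilson score confidence interval for a binomial proportion."""
    z = NormalDist().inv_cdf(0.5 + confidence / 2)  # two-sided critical value
    p = successes / trials
    denom = 1 + z**2 / trials
    center = (p + z**2 / (2 * trials)) / denom
    half = z * ((p * (1 - p) / trials + z**2 / (4 * trials**2)) ** 0.5) / denom
    return center - half, center + half

# One fatality in 130,000 "trials" (Autopilot, one trial per 1,000 miles)
ap_lo, ap_hi = wilson_ci(1, 130_000, 0.85)
# One fatality in 2,000,000 "trials" (Autopilot off)
man_lo, man_hi = wilson_ci(1, 2_000_000, 0.85)
print(f"Autopilot: [{ap_lo:.3g}, {ap_hi:.3g}] per trial")
print(f"AP off:    [{man_lo:.3g}, {man_hi:.3g}] per trial")
```

With these inputs and this method, the Autopilot lower bound lands just above the AP-off upper bound at 85% confidence, consistent with the "don't intersect, but only at a CI of 0.85" observation.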