Discussion in 'Tesla, Inc.' started by bro1999, Nov 10, 2016.
Tesla's own numbers show Autopilot has higher crash rate than human drivers
I would say that is an unfair comparison. The sample size is too small, as the author states.
In addition, the single Autopilot failure was not a normal crash. Autopilot isn't intended to be a self-driving system - just driver assistance, like autopilot on an airplane. If the driver had been watching the road, they would have hit the brakes before the car hit the truck.
If the pilot of an airplane left it unattended, would you say the crash was the fault of the autopilot system or of the pilot?
I would say it's entirely plausible largely because of owners disregarding the caveats.
I like your unattended plane autopilot analogy.
From the numbers in the article, Tesla is 2 to 4 times safer than the average car. The US fatality rate is about 1 per 100 million miles.
Tesla wins again.
This is interesting. Reading many threads in many forums, there seem to be two types of AP owners -
"I treat it as an aid; it's fantastic as an aid, but it has limitations I understand", and
"I rely on it to think for me in many situations, and it sucks in many situations"
The danger, I suspect, is that it is easy to become complacent - it copes 80-90% of the time, so you gradually become less aware as your trust in the system inevitably increases (modulo having had a really bad experience early on).
Going to be interesting. It was a definite safety factor in my purchase decision - but the strength and structure of the vehicle was the major safety factor for me.
I'm guessing the demographic that can purchase (and thus drive/crash) a Tesla skews damn near everything.
Where's the 16-year-old in a $500 Civic? Where's the 90-year-old in a 22-foot-long Buick who can't see over the wheel?
With statistics involving one (1) sample, you can do a lot of fun things. Think of all the things the new U.S. President-Elect has said or done that no candidate before him had ever said or done. By that logic, as of yesterday it is statistically 100% certain that you will win the U.S. presidential election if you ... (insert according to inspiration). That's not how statistics work (or at least not how they become of any interest).
I love data. Handled the right way, it can be spun to support just about any argument.
I had an exchange where the person stated, "What about a single-lane country road in the rain at night? It will not see a broken-down scooter on the side of the road." Then he added something about it being dangerous. I replied, "Your cruise control will drive you into a brick wall and you still use it." He stopped arguing. I did offer him his first ride in a Model S - which he turned down.
I would bet autonomous vehicles happen quicker than people learn how to use autopilot.
Elon should not be announcing how statistics indicate Autopilot Teslas are more safe than human-driven cars then. He can't have his cake and eat it too.
It's a calculated risk. He's only trying to demonstrate it's still safer than your average car and doesn't deserve the amount of media attention it gets (relative to the thousands of other accidents that happen). Elon's statistic also has another nuance in that he's including the autopilot safety features (which are always on even if you didn't activate autopilot itself), not just the autopilot mode. Comparing cars with autopilot hardware and without autopilot hardware may also skew this.
Of course the biggest negative and risk is that inevitably there will be another death, and the statistics will be skewed immediately due to the extremely small sample size (we had the same discussion when talking about the fire incidents).
The further you are from knowing the source of the data, the less reliable it is to use that data in comparisons. As the author stated, he doesn't know lots of things about the data that has been thrown around, and he's getting it from lots of different sources... so about the only thing we can say based on the numbers he used is: "I have no idea if Autopilot is safer or not."
I would not be surprised if it weren't safer statistically at this point, based on the number of people who seem to think it means you can take a nap or ignore the road while driving. But that doesn't mean it can't be safer, or that it wouldn't be safer if used as intended (which I do totally believe, though without proof). And as it gets better and better, I cannot imagine any way it would not end up significantly better than human drivers when used as intended.
Also, Autopilot miles are driven at much higher speeds. You are much less likely to die in a crash at 15 mph, the speed at which the majority of non-Autopilot miles are driven.
I believe the fatality rate for AP is worse than this article concludes, simply because the number of non-autopilot fatalities includes "non traffic" related deaths, such as suicides, stolen car chases, and high speed pursuits. By my reckoning, there are three non-AP fatalities that occurred in the course of normal driving, and two AP fatalities (one in China) that occurred in the course of normal driving. That makes the per-mile AP fatality rate 9x higher than the non-AP rate.
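The "9x" reckoning above can be checked with quick arithmetic. The mileage figures below are assumptions, not from this thread: roughly 222 million Autopilot miles out of roughly 3.2 billion total Tesla miles were the numbers commonly cited around that time.

```python
# Rough check of the "9x higher" claim, using assumed figures:
# ~222 million Autopilot miles, ~3.2 billion total Tesla miles.
ap_miles = 222e6
total_miles = 3.2e9
non_ap_miles = total_miles - ap_miles

ap_rate = 2 / ap_miles          # two AP fatalities in normal driving (incl. China)
non_ap_rate = 3 / non_ap_miles  # three non-AP fatalities in normal driving

ratio = ap_rate / non_ap_rate
print(f"AP per-mile fatality rate is {ratio:.1f}x the non-AP rate")
```

With those assumed mileages the ratio does come out close to 9, but it is obviously hostage to the mileage estimates.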
Furthermore, the samples are statistically valid, because the "sample size" is way more than two or three. Imagine you had a biased, trick coin which turned up heads (a fatality) only one in a million times. A news article comes out which claims that the coin came up heads. Does that mean it was flipped only once? Was the sample size only one? No; it was probably flipped around a million times, for a sample size of one million. It's the same with fatalities: there are many chances to have a fatality, but most of the time they are avoided (the coin comes up tails). But just because a fatality was avoided doesn't mean the coin wasn't flipped.
Let's say a chance to make a fatal mistake comes up every thousand miles of driving: bad weather, a multi-car accident, sunlight, driver health problems, blown tire, road debris, slick road, aggressive driving, sleepy truck driver, etc. That's 3.2 million chances for a fatality over all Tesla miles driven (3.2 billion). That's a lot of coin flips, and the observed fatality rate can tell you a lot about the "real" fatality rate.
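The coin-flip argument can be sketched numerically. The one-chance-per-thousand-miles figure is the post's own hypothetical; this just shows how much 3.2 million "flips" constrain the underlying rate:

```python
def prob_at_least_one(p, n):
    """Probability of seeing at least one fatality ("heads") in n
    independent chances, each with probability p."""
    return 1 - (1 - p) ** n

# Hypothetical: one chance per thousand miles over 3.2 billion miles.
n = 3_200_000
for p in (1e-6, 1e-7, 1e-8):
    print(f"true rate {p:.0e}: P(>=1 fatality) = {prob_at_least_one(p, n):.3f}")
```

If the true per-chance fatality rate were 1 in a million, seeing at least one fatality over 3.2 million chances would be almost certain; at 1 in 100 million it would be quite unlikely. That's the sense in which the observed count says something about the real rate.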
Now you are going with even more arbitrary statistics. If you do that, you have to split the "non-AP" driving into "traffic-related" and "non-traffic-related" driving (the miles traveled during suicides, stolen-car incidents, high-speed pursuits, and whatever other arbitrary criteria you apply).
I think what was mentioned above (that AP can only be activated in higher speed driving) makes much more sense if you are going to play those kinds of games. If you leave out all driving in other conditions, then the statistics may turn out way different (as mentioned, it's pretty unlikely to have a fatality at 15mph).
Also, the fatality in China has not been confirmed to be Autopilot-related, because the family refuses to let Tesla access the vehicle (which is likely also why it was left out of the report). So officially there is only 1 AP fatality.
Perhaps sample size is the wrong term (I'm not a statistician - an actual statistician should weigh in on this), but with a single incident the likelihood of it being an outlier is quite high.
Outlier - Wikipedia
The time at which it occurred is also a factor. Take your coin example: what if the coin came up heads within the first 100k flips? Then it would seem the probability of heads is 1/100,000, even though the "real" probability is 1 in a million.
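That early-heads scenario isn't even rare. A quick binomial calculation (a sketch, using the same hypothetical 1-in-a-million coin) shows how often a 100k-flip sample produces exactly one heads:

```python
def prob_exactly_one(p, n):
    """Binomial probability of exactly one heads in n independent flips."""
    return n * p * (1 - p) ** (n - 1)

p_true = 1e-6   # "real" heads probability: 1 in a million
n = 100_000     # flips observed so far

# Roughly 9% of 100k-flip samples contain exactly one heads -- and each
# of those observers would naively estimate the rate at 1/100,000,
# ten times the true rate.
print(f"P(exactly one heads) = {prob_exactly_one(p_true, n):.3f}")
```

So with small counts, the naive observed rate can easily overstate the true rate by an order of magnitude.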
There's probably also a false-positive-paradox element to a single incident. Basically: would the same driver also have gotten into an accident if he had been using plain cruise control?
False positive paradox - Wikipedia