My friend's model X crashed using AP yesterday

...and the same kind of disclaimer pops up on navigation software, noting that you, the driver, are ultimately responsible for evaluating the directions and picking a safe route. If Google tells you to turn left onto a road that has a "road closed" sign across it, what would you do?

That disclaimer used to appear, most recently for newer features like biking directions. It no longer appears for any directions. But, to be clear, I was referring only to navigation correctness (determining whether I'm on the right road), not to whether it's safe for me to be on that road. Here, automation reduces what my brain needs to hold in working memory, so it's less capable of answering questions about the route.

However, my intent is not to absolve drivers. The driver is still "ultimately responsible." Of that there is no question, because Autopilot is not fully autonomous.

My point is that Autopilot renders drivers less capable, even while they remain just as responsible. And Tesla, for its part, has its own, different kind of ultimate responsibility: to avoid negligent design of its products, by disabling Autopilot on road types where the statistics show it results in a net increase in accidents and fatalities.

I think the metaphor is broken on many levels, but taking the last part: if you drive a car you will eventually have an accident (be "bitten"). The question is whether your chances of eventually being bitten are higher or lower if you have AP as a tool at your disposal.

That depends on whether Autopilot is being used on a road that it is suitable for, or a road that it is not. On divided highways I would suspect that Autopilot, while still turning many people into zombies, reduces the accident rate. Autopilot works better here, and many people already zone out on the highway (the task is too boring), so having Autopilot also watching is a net win. When it comes to fatalities, not just accidents, the data is less clear, and hints at a different story.

On non-divided ("restricted") roads we have no distinct data sets to evaluate, but based on the poor showing of Autopilot in the most recent accidents, I personally believe that Autopilot is a net loss of safety. More net zombies created, less capable Autopilot.
 
My WeChat group friend went back to the place where the accident happened and took some photos. I don't know how to save the video from the WeChat message, so I took a screenshot, attached. I'm a huge fan of Tesla and Elon, and I own Tesla stock. The reason I'm posting here is to get drivers to pay attention and learn from this accident. From what this photo shows, the road looks to be in a condition where I would have enabled AP. I have learned that highways are still risky; in the future I will only use it on freeways.

Actually the real lesson to be learned is TO ALWAYS BE LOOKING FORWARD when using AP and BE PREPARED TO TAKE OVER AT ANY TIME.
 
That disclaimer used to appear, most recently for newer features like biking directions. It no longer appears for any directions. But, to be clear, I was referring only to navigation correctness (determining whether I'm on the right road), not to whether it's safe for me to be on that road. Here, automation reduces what my brain needs to hold in working memory, so it's less capable of answering questions about the route.

However, my intent is not to absolve drivers. The driver is still "ultimately responsible." Of that there is no question, because Autopilot is not fully autonomous.

My point is that Autopilot renders drivers less capable, even while they remain just as responsible. And Tesla, for its part, has its own, different kind of ultimate responsibility: to avoid negligent design of its products, by disabling Autopilot on road types where the statistics show it results in a net increase in accidents and fatalities.



That depends on whether Autopilot is being used on a road that it is suitable for, or a road that it is not. On divided highways I would suspect that Autopilot, while still turning many people into zombies, reduces the accident rate. Autopilot works better here, and many people already zone out on the highway (the task is too boring), so having Autopilot also watching is a net win. When it comes to fatalities, not just accidents, the data is less clear, and hints at a different story.

On non-divided ("restricted") roads we have no distinct data sets to evaluate, but based on the poor showing of Autopilot in the most recent accidents, I personally believe that Autopilot is a net loss of safety. More net zombies created, less capable Autopilot.
I don't agree with the conclusion that AP is a net loss of safety based on three accidents, as these three incidents represent a very, very small fraction of the overall miles on which AP has worked correctly and safely. Incorrectly worn seat belts or incorrectly installed child seats also kill, but that doesn't mean seat belts or child seats are a net loss of safety.
 
The screenshot (https://teslamotorsclub.com/tmc/attachments/image-png.185224/) is here on Google Maps.

It appears this is where the post barrier starts, and the reflectors on the posts may have confused Autopilot.

I think you may have a point about AP getting confused by the reflectors. A few days ago, driving with the sun at my back, AP seemed to mistake the rumble strip next to the lane line (which was lit up by the sun) for the lane line, and it swerved towards it. This seems to be one of the biggest shortcomings of AP right now. I've seen it get confused about what is/isn't a lane line a number of times. Fortunately I was always able to take back control in time.
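Purely as a toy sketch (in Python, and emphatically not Tesla's actual vision pipeline): if a lane detector scored candidate markings by brightness contrast, a sunlit rumble strip could out-score the true painted line, producing exactly the swerve described above.

```python
# Toy illustration only: a contrast-scored lane-line picker, NOT Tesla's
# actual algorithm. It shows how a sunlit rumble strip next to the real
# lane line could win a brightness-based comparison.

def pick_lane_line(candidates):
    """Pick the candidate marking with the highest brightness contrast.

    candidates: list of (name, contrast) tuples, where contrast is the
    marking's intensity relative to the surrounding road surface.
    """
    return max(candidates, key=lambda c: c[1])

# Normal lighting: the painted line is the brightest feature, so it wins.
print(pick_lane_line([("painted lane line", 0.80),
                      ("rumble strip", 0.35)]))
# -> ('painted lane line', 0.8)

# Low sun lighting up the rumble strip grooves: the wrong feature wins,
# and the car would steer toward it.
print(pick_lane_line([("painted lane line", 0.80),
                      ("sunlit rumble strip", 0.90)]))
# -> ('sunlit rumble strip', 0.9)
```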
 
Actually the real lesson to be learned is TO ALWAYS BE LOOKING FORWARD when using AP and BE PREPARED TO TAKE OVER AT ANY TIME.

That's one lesson. There's also the fact that if your hands are on your lap, it's going to take longer to respond in an emergency.

Perhaps Autopilot can refuse to activate on certain roads?
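As a hypothetical sketch of that idea (the road classes and policy here are invented; Tesla's actual activation logic is not public):

```python
# Hypothetical sketch of road-type gating for a driver-assist feature.
# The road classes and the allow-list are made up for illustration.

ALLOWED_ROAD_TYPES = {"divided_highway", "freeway"}

def may_engage_autosteer(road_type: str) -> bool:
    """Only allow autosteer on road types where it is known to perform well."""
    return road_type in ALLOWED_ROAD_TYPES

for road in ("freeway", "undivided_two_lane", "divided_highway"):
    status = "engage allowed" if may_engage_autosteer(road) else "refused"
    print(road, "->", status)
```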

But if you're going to introduce automatic steering it has to be idiot proof. Either the car steers itself safely enough on certain roads or it doesn't. And that means (at least to me) not hovering over the steering wheel with your hands or loosely holding the wheel.

There's a logical disconnect here. Yeah, you can always blame the idiot driver. There's a reason why they print "Remove Foil Before Use" on suppository labels.
 
My WeChat group friend went back to the place where the accident happened and took some photos. I don't know how to save the video from the WeChat message, so I took a screenshot, attached. I'm a huge fan of Tesla and Elon, and I own Tesla stock. The reason I'm posting here is to get drivers to pay attention and learn from this accident. From what this photo shows, the road looks to be in a condition where I would have enabled AP. I have learned that highways are still risky; in the future I will only use it on freeways.

I have located the accident scene based on your images. It was 1.9 miles south of the point where MT-2 deviates from Interstate 90. My guess is your friend enabled Autopilot at the point where the road straightens out going south, about 1.6 miles before the accident point, which was at the end of a curve about 170 metres long, with a change in azimuth from 144 to 161 degrees. The shoulder and center lines are well marked at that location. However, it is possible the reflectors on the guard posts confused Autopilot, although looking at Google Street View most posts are not marked, maybe only 3 in the whole length of the guardpost section. Your friend would have had almost no time to react as there is no shoulder there, but it appears as if the Model X just kept on turning after the road straightened out.
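For a rough sense of how gentle that curve is, here is a back-of-the-envelope calculation, assuming the 170 metres is arc length and guessing at the speed (the actual speed is unknown):

```python
import math

# Back-of-the-envelope curve geometry from the figures in the post:
# ~170 m of curve with azimuth changing from 144 to 161 degrees.
arc_length_m = 170.0
heading_change_rad = math.radians(161 - 144)   # 17 degrees

# Radius of a circular arc: R = arc length / angle subtended.
radius_m = arc_length_m / heading_change_rad
print(f"curve radius ~ {radius_m:.0f} m")       # ~573 m

# Lateral acceleration at an assumed 60 mph.
speed_ms = 60 * 0.44704                          # mph -> m/s
lat_accel = speed_ms ** 2 / radius_m
print(f"lateral accel ~ {lat_accel:.2f} m/s^2 (~{lat_accel / 9.81:.2f} g)")
```

At roughly 0.13 g that is a gentle curve, which makes the car continuing to turn after the road straightened out all the more puzzling.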
 
But if you're going to introduce automatic steering it has to be idiot proof. Either the car steers itself safely enough on certain roads or it doesn't. And that means (at least to me) not hovering over the steering wheel with your hands or loosely holding the wheel.

Well then you will never see it. I suspect it will be a while before the system predicts a meteor strike in front of the car and can take appropriate predictive action. None of the present systems can handle a tire in the road. Mobileye is working on it and has some good lab results, but I don't know of anything that does it in practice.

The systems will get better and better. Right now AEB reduces but doesn't eliminate rear-ending the car in front of you. The insurance industry already loves it. Eventually all of these systems will handle 99% of the cases, but there will still be corner cases not covered.
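As a simplified illustration of why AEB mitigates rather than eliminates rear-end crashes (a generic time-to-collision trigger with made-up thresholds, not any manufacturer's actual logic):

```python
# Generic time-to-collision (TTC) AEB sketch, for illustration only.
# It ignores sensor latency, brake ramp-up, and lead-car braking.

BRAKE_DECEL = 8.0      # m/s^2, near the limit for a hard stop on dry road
TTC_THRESHOLD = 1.5    # s, fire AEB below this time-to-collision

def aeb_outcome(gap_m: float, closing_speed_ms: float) -> str:
    ttc = gap_m / closing_speed_ms
    if ttc >= TTC_THRESHOLD:
        return "no trigger yet"
    # Distance needed to shed the closing speed: v^2 / (2a).
    stopping_m = closing_speed_ms ** 2 / (2 * BRAKE_DECEL)
    return "crash avoided" if stopping_m <= gap_m else "impact mitigated only"

print(aeb_outcome(gap_m=10.0, closing_speed_ms=8.0))   # crash avoided
# Corner case: a car cuts in at close range with high closing speed.
print(aeb_outcome(gap_m=6.0, closing_speed_ms=12.0))   # impact mitigated only
```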
 
That's one lesson. There's also the fact that if your hands are on your lap, it's going to take longer to respond in an emergency.

Perhaps Autopilot can refuse to activate on certain roads?

But if you're going to introduce automatic steering it has to be idiot proof. Either the car steers itself safely enough on certain roads or it doesn't. And that means (at least to me) not hovering over the steering wheel with your hands or loosely holding the wheel.

There's a logical disconnect here. Yeah, you can always blame the idiot driver. There's a reason why they print "Remove Foil Before Use" on suppository labels.
The only way to make it idiot-proof is to go fully autonomous and take control away from the idiots. However, that cannot be achieved overnight or by testing only in the lab.

How many people died when Ford first made the car?
 
How come they didn't list the brand of the "3,000-pound death sled"? Sounds like that should be recalled.
Besides getting used to the favorable attention, this is the other thing we have to worry about as Model X owners - if you're in a crash, it becomes national or maybe international news. Everyone is all over you, analyzing every single aspect and word you utter. My family isn't prepared for that kind of public scrutiny.
 
Let me also add that there's a fundamental dissonance going on here. Autopilot steers the car and keeps the car between the lines, etc. But if the disclaimer is that you always need to pay attention and keep your hands on the wheel just in case something happens, then is it really "Autopilot"? And yes, I understand the real usage of AP. But one has to admit it's a subtle distinction for Joe Average driver to comprehend. Even if you take your hands off the wheel for a moment, something bad could happen, and those fractions of a second could mean the difference between life and death. In reality, one should never remove their hands from the wheel or be distracted while driving. And if that's the case, then what is Autopilot, really?

Great point!

My 1962 Cessna 182E had a single-axis autopilot: it would keep the wings level and would perform a smooth and steady bank on command. This allowed hands-free flying when straight and level or in relatively shallow turns. A more expensive avionics package was a three-axis autopilot, which would also hold altitude or a constant rate of climb/descent.

More advanced autopilots added homing on VOR radials, radial intercept, climb-and-maintain, and many other features. Modern airliners can include autoland, and have features allowing nearly touchless flight for 95% of a trip.

But in NO CIRCUMSTANCE does any of that take away the responsibility of the pilot for the operation of the aircraft. Autopilot has never meant "set it and go to sleep."

So it may be a confusing term. But it is auto-"pilot" because it was invented a very long time ago to make flying safer.

I think Tesla is doing the same thing: making transport safer.
 
Then how did he get a US/Montana driver's license? Isn't he expected to be able to read, understand, and accept the EULA when enabling beta software features? Or did he just irresponsibly ignore the warnings? I think so...

It is not impossible to be a Chinese speaker and still pass a driving test in some states that welcome non-English speakers.

He's from Seattle, WA, not MT. Here's the official Driver Guide in Chinese. Looks like all the warning signs are in Chinese :)

 

Whatever the involvement of the AP, I do find it quite spectacular how Tesla doesn't seem to hesitate for a minute when it comes to bashing their own customers in public statements. They even denied involvement of the AP in that case where they had to admit later that they hadn't been able to retrieve any data from the heavily damaged vehicle.

Apparently the first order of business for a Tesla driver after an accident should be to disable any further communication between the vehicle and Tesla. Otherwise the driver has to be prepared for Tesla to make all sorts of claims as to the driver's negligence, pointing at data which cannot be verified. With billions of USD at stake, why should there be an automatic assumption that Tesla's claims are correct? They even shove in their claim that driving with AP is safer than driving without, even though the statistical support for this is still rather slim.
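On how slim that support is, a quick sketch with entirely hypothetical numbers (Tesla's underlying data isn't public): with only a handful of fatal events, the confidence interval on a crash rate is too wide to conclude much either way.

```python
# Entirely hypothetical numbers to show why a handful of events gives
# weak statistical support either way. Assumes crashes are Poisson and
# uses a rough normal approximation for the interval.
import math

events = 3            # hypothetical fatal crashes on AP
miles = 130e6         # hypothetical AP miles driven

rate = events / miles * 1e8   # events per 100M miles
# Rough 95% interval for a Poisson count: lambda +/- 1.96 * sqrt(lambda).
lo = max(events - 1.96 * math.sqrt(events), 0) / miles * 1e8
hi = (events + 1.96 * math.sqrt(events)) / miles * 1e8
print(f"rate ~ {rate:.1f} per 100M miles, 95% CI roughly ({lo:.1f}, {hi:.1f})")
# -> rate ~ 2.3, CI roughly (0.0, 4.9): far too wide to settle "safer or not".
```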
 
Whether there is Autopilot or not, unlike Google, Tesla has not deprived its customers of the good old classic brake pedal and steering wheel.

As far as we know, these accidents didn't happen to drivers with missing limbs who couldn't use the brake pedal and steering wheel.

Just because the word "Autopilot" is there, it doesn't mean people suddenly lose the use of their limbs for the brakes and steering wheel.
 
True, but rather irrelevant here. Tesla doesn't issue a generic statement that the driver is responsible for the conduct of the vehicle and has the means to control it. No, they explain in detail what the driver allegedly did and why his actions caused the accident.