
My friend's Model X crashed using AP yesterday

Because the 99% of owners who are actually responsible would not like the use and functionality of their cars limited in order to cater to future Darwin Award winners.

I hate to belabor the point, but how is it responsible to use AutoPilot on roads that Tesla explicitly claims it is not intended for? My owner's manual says:

"Autosteer is intended for use on freeways and highways where access is limited by entry and exit ramps."
 
It seems odd that you can't use (or shouldn't use) a safety assist system on all roads.

I think a lane keeping assist system should be available as a back up to the driver on all roads, not just freeways.

You said it right: safety *assist*... not safety *relegate responsibility to*.

In the best situations I'm always monitoring Autopilot to make sure nothing unexpected is coming up, and when I do see something unexpected, like a curve in the road, I always put my hand on the wheel. People have to understand it's only safer when both the computer and the person are working together.
 
I have written this before: I believe you should have to PASS a training class online or ?? to be ABLE to use Autopilot, or to operate your car for that matter. How many accidents have we seen with newbie drivers not recognizing the speed, acceleration, range, power, and limitations of driving an EV? I hate the big brother attitude, but perhaps this would take some of the heat off of Tesla every time someone screws up!! o_O
 
...
There seems to be quite a rapid succession of reports of drivers using AP in ways that the software/hardware is not capable enough to handle.

Bingo.... No set of instructions or disclaimers is ever going to overcome the fallibility of human nature.

This is why I believe we can't get to L4 functionality soon enough. The road until that point will continue to be a bumpy and expensive one. The fear is, once tort attorneys get a hold of the process, it might become so expensive that doing so via "public betas" might come to an end. Let's hope that AP2 gets us way down the path on this journey.
 
I hate to belabor the point, but how is it responsible to use AutoPilot on roads that Tesla explicitly claims it is not intended for? My owner's manual says:

"Autosteer is intended for use on freeways and highways where access is limited by entry and exit ramps."

Driving has always been about using judgement. Even with something basic like speed, a driver needs to use his/her judgement on appropriate speed. There are times when driving 10 mph over the speed limit is fine (and even necessary) and there are times when driving 10 mph under the speed limit is still too fast.

Similarly, there is no binary good road/bad road decision. There will be some roads that are obviously good candidates for AP, some that are obviously poor candidates for AP, and a whole lot of roads that fall into the grey area in the middle. Tesla could take the position that they will only allow AP to engage on roads that are 100% AP-friendly, or they could take the position that most of their owners are grown-ups and, given appropriate guidance, can make their own decisions.

Also, bear in mind that AP is a learning system, so the only way AP will ever get better at navigating other types of roads is by actually traversing those roads.
 
The older-style wooden posts and cable in post #37 of "My friend's Model X crashed using AP yesterday" indicate a secondary road, and the barrier's proximity to the road suggests a curve and drop-off. The 4th photo at that link shows railway tracks at the back of the lighted area.

I'm not saying the link below is the location, but it fits with a curve, posts and cable, embankment, railway tracks at the bottom, no cell reception, and is close to Whitehall, MT as shown in the cell phone screenshot. Sooner or later Tesla will confirm the location of the crash, but I suspect it is very much like this:

Google Maps
 
Because the 99% of owners who are actually responsible would not like the use and functionality of their cars limited in order to cater to future Darwin Award winners.

Given the frequency with which these stories are being reported, I think your 99% might be a little optimistic. Never underestimate the power of stupid people in large groups...especially when a lawyer starts seeing dollar signs.
 
As someone who owns an AP car I've had just the opposite experience. Very intuitive. Easy to engage, use, disengage. Clear visible and audible indicators when engaged. Clear visible and audible indicators when disengaged.
Within a few minutes of using it I knew what to expect...and what it could and couldn't do. Basically assistance with speed control and assistance with lane keeping.
As an "old guy" who can't figure out Snapchat if I can figure this out then anyone can!
@idoco, I too am clueless about Snapchat ;) and I found using AP pretty straightforward. But then I was using it as Tesla intended: on a divided road with no cross traffic. Before I used it for the first time, I had read quite a bit about it.

However, there are many people in the world who don't read the manual and exercise poor judgement while driving, and may not have the mental acuity that you do. To them, AP may be confusing because it is a rather different driving experience. They may benefit from a formal training course run by Tesla.
 
You said it right: safety *assist*... not safety *relegate responsibility to*.

In the best situations I'm always monitoring Autopilot to make sure nothing unexpected is coming up, and when I do see something unexpected, like a curve in the road, I always put my hand on the wheel. People have to understand it's only safer when both the computer and the person are working together.

Agree 100% that the person and computer need to work together, which is why I think it's odd that the computer, aka AutoPilot, should be turned off on anything but divided highways.

A safety assist system should always be available no matter what kind of road you are on.

Am I looking at this wrong? Why wouldn't lane keeping always be available to potentially save the day if the driver screws up?
 
There seems to be quite a rapid succession of reports of drivers using AP in ways that the software/hardware is not capable enough to handle.
The trouble with trying to make things foolproof is that there are a lot of very ingenious fools.

Autopilot does OK on the expressway, particularly if the lines on the road are painted well. It seems to do OK at very low speed in cities.

It should not be used on high-speed, undivided roads with intersections. There's no way for the software to do lanekeeping on those roads, none at all, and it shouldn't pretend to.

Perhaps automated braking could remain enabled on these roads, but not lanekeeping.

It's probably not straightforward to geofence Autopilot to only work on expressways. If I were Tesla, I'd still try to do so immediately.

My local Tesla owners club has, among those who own cars with Autopilot, *one* couple who are using it responsibly on the Interstate (it works great), and a whole bunch who keep trying to use it on rural roads (it doesn't work well, duh).
 
First part: that only works if Tesla has a system in place to report and correct false road data. If not, then you're just going to have a lot of frustrated customers whose AP cars won't engage on a road that's clearly a divided highway because the map database is wrong. Furthermore, there are different kinds of divided highways. Limited access, non-limited access (freeway or at-grade intersections)... Where do you draw the line?
Limited access only. We already know the automated braking is not really reliable on non-limited-access roads.
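
To make the "limited access only" idea concrete, here is a minimal sketch of what an engagement gate with a map-correction fallback (the report-and-fix loop raised above) could look like. This is illustrative Python only; RoadClass, autosteer_allowed, and report_map_error are hypothetical names, not Tesla's actual software:

```python
from enum import Enum, auto

class RoadClass(Enum):
    LIMITED_ACCESS = auto()    # freeway/highway: entry and exit ramps only
    DIVIDED_AT_GRADE = auto()  # divided highway with at-grade intersections
    UNDIVIDED = auto()         # ordinary two-way road
    UNKNOWN = auto()           # no map coverage for this segment

def autosteer_allowed(road: RoadClass) -> bool:
    """Hypothetical gate: engage Autosteer on limited-access roads only."""
    return road is RoadClass.LIMITED_ACCESS

def report_map_error(observed: RoadClass) -> None:
    """Queue the segment for review instead of silently refusing forever."""
    print(f"queued map review: segment classified as {observed.name}")

def try_engage(road: RoadClass, driver_disputes_map: bool = False) -> str:
    if autosteer_allowed(road):
        return "Autosteer engaged"
    if driver_disputes_map:
        # Without this path, wrong map data means permanently locked-out
        # customers -- the objection raised in the post above.
        report_map_error(road)
        return "Autosteer unavailable; map correction submitted"
    return "Autosteer unavailable on this road type"

print(try_engage(RoadClass.DIVIDED_AT_GRADE, driver_disputes_map=True))
```

The design point of the sketch is the dispute path: a hard geofence is only tolerable if drivers can flag misclassified roads and the database actually gets corrected.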
 
Driving has always been about using judgement. Even with something basic like speed, a driver needs to use his/her judgement on appropriate speed. There are times when driving 10 mph over the speed limit is fine (and even necessary) and there are times when driving 10 mph under the speed limit is still too fast.

Similarly, there is no binary good road/bad road decision. There will be some roads that are obviously good candidates for AP, some that are obviously poor candidates for AP, and a whole lot of roads that fall into the grey area in the middle. Tesla could take the position that they will only allow AP to engage on roads that are 100% AP-friendly, or they could take the position that most of their owners are grown-ups and, given appropriate guidance, can make their own decisions.

You could make the same argument regarding drunk driving: some people above a certain blood alcohol limit can drive well, and some cannot. It's up to the individual driver to know when they can drive home safely, and when they should take a cab. They should exercise appropriate judgement.

Society and the courts do not agree. They've decided that, for the safety of all, any blood alcohol level above .08% is a hard line that constitutes impairment, and it's not left up to the judgement of the driver whether to drive or not. It turns out that many people have terrible judgment.

Similarly, I view the operation of Autopilot as drastically impaired on certain types of roads and in certain weather conditions. It makes decisions which, if left uncorrected by the chaperone, result in accidents (e.g. running stop signs). It is effectively driving under the influence. I don't think society is going to treat any system which behaves as if it were drunk as though it were not drunk, "because computers", or leave it up to the driver to decide how impaired they think the system will act at any given point in time. I expect a zero tolerance policy will eventually apply, but I guess we'll see. It's possible that Autopilot bootstraps itself to a safe level on all roads and in all weather conditions before the law catches up.
 
I can see why it may not make business sense to do so. However, from a safety perspective, I don't see any reason why Tesla should limit the top speed of AP on known unsafe roads, yet not go all the way and disable AP completely. As it stands now, Tesla is explicitly programming the car to allow use of AutoPilot on roads they know it was not designed for, just with restricted speeds. If Tesla knows AP is unsafe in a particular situation, and it is reasonable to turn it off, it should not be allowed at all. Otherwise you have negligent design. That differs from maximum speed restrictions on cars, where it is impossible to judge in real time whether the speed is appropriate or not.
Max speed restrictions are easy to judge with speed limit signs and GPS. And even absent that, there are clearly speeds that public roads are not suitable for (for example, there is no public road in the USA where 106 mph would not be considered reckless driving; the highest posted limit in the US is 85 mph, and 20 mph above posted is considered reckless).
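
As a rough illustration of that claim, the arithmetic can be written down directly (the 85 mph ceiling and the 20 mph reckless margin are the figures from this post; the function name is just for illustration):

```python
HIGHEST_POSTED_LIMIT_MPH = 85  # highest posted limit in the US, per the post
RECKLESS_MARGIN_MPH = 20       # 20 mph above posted is treated as reckless

def max_defensible_speed(posted_limit_mph: int | None) -> int:
    """Speed above which driving is arguably reckless on any US public road."""
    if posted_limit_mph is None:
        # No readable sign and no GPS match: fall back to the national
        # worst case rather than refusing to answer.
        posted_limit_mph = HIGHEST_POSTED_LIMIT_MPH
    return posted_limit_mph + RECKLESS_MARGIN_MPH

# 85 + 20 = 105, so 106 mph clears the reckless bar everywhere in the US.
assert max_defensible_speed(None) == 105
```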

Again, you missed the analogy raised by others. Is it negligent for an automaker to release a car with a 155 mph top speed (or even higher) and not disable that speed on public roads? They certainly have the technology to do so (for example, the Skyline GTR does that). Why haven't automakers done that broadly, and why haven't people called them negligent? Trying to answer that question will give you an idea of the point people are making.
 
Hey All - Former Montana and Whitehall resident here. First off: this is most likely driver error. Not in the sense that the driver made an active mistake, but more that they used the system when they shouldn't have.

That road is riddled with potholes, and there are drop-offs on both sides into a ditch, with and without a painted line. In some areas the vegetation on the side of the road overgrows and covers the lines. The center lines are not marked correctly and often disappear. Not to mention numerous turn-offs without markers, lines, or notification. Hell, a person with 20/20 vision can sometimes not see on that road at night. Montana is absurdly dark; Whitehall has no major cities nearby that can cause light pollution. The roads are exceptionally dark - so dark you can see the Milky Way.

You can see this in the pictures @Mark Z posted. If you look at the date, it says 2013 on the Google Maps link. Montana is not big on keeping those roads up to date, so the road is most likely in even worse shape now than when it was pictured.

In the second set of pictures from the OP you can see the wooden stakes. Those are placed there by the county / Montana DOT, and are in fact older-style guard rails. (See how up to date Montana DOT is?)
 
Why wouldn't lane keeping always be available to potentially save the day if the driver screws up?

I'm not sure I understand what you're asking. Lane keeping (aka AutoPilot) is always available as long as there are road/lane markings clear enough for the system to see and use.

In addition, when the driver disables AutoPilot he/she still has the option to have the Lane Departure Warning system turned on at all times.

Mike
 
Again, you missed the analogy raised by others. Is it negligent for an automaker to release a car with a 155 mph top speed (or even higher) and not disable that speed on public roads? They certainly have the technology to do so (for example, the Skyline GTR does that). Why haven't automakers done that broadly, and why haven't people called them negligent? Trying to answer that question will give you an idea of the point people are making.

Actually, in the UK heavy trucks are speed-limited to 56 mph, and in Japan most consumer cars are limited to 112 mph. Elsewhere, it's a different story, of course.

So far, nobody has attempted to answer that question. They've only cited fast cars as an existence proof that anything goes when it comes to negligence. I thought I answered it in a prior post, but I'm sure I did a terrible job, so I'll try again. I believe that cars are still designed to exceed public speed limits for the following reasons:
  • When everyone else is exceeding the speed limit, it's safer if the car can, too.
  • During an emergency, it may be necessary to exceed the speed limit.
  • GPS and the automatic determination of speed limits are new, unproven, and unreliable technology. If such a system fails, the public can be at risk due to the points above. They're even more at risk should they be erroneously limited to a speed far below the speed limit (i.e. the system places them on the wrong road).
  • If a consumer knows a car can go 155 mph, they know it has the power, and thus the acceleration, to operate safely in highway merge scenarios, etc. If it's limited, they begin to suspect it won't be able to accelerate safely. This perception is hard to overcome.
  • The number of crashes caused by high speeds is relatively low. So low, in fact, that it would be cheaper for automakers to settle lawsuits than to suffer lost sales from speed-limited cars. It would likely take a federal law mandating governors before manufacturers would comply.
So, because restricting speed would actually decrease safety in some scenarios and increase the liability of automobile manufacturers (points one, two, and three), no court has yet ruled that a geofenced speed governor can be implemented reasonably. Thus, the consensus remains that a GPS-backed governor is not yet a reasonable design choice. If it's not reasonable, there's no negligence, and automakers would very, very much prefer to retain high maximum speeds because of point four. They'd rather just pay the victims.

Note that in the U.S. the Skyline GTR is limited to 156 mph and is not GPS-enabled. It's only in Japan, where all cars are limited to 112 mph, that the Skyline offers a GPS geofence option, and even there it only detects whether or not you're at a race track and removes the static limiter. Hardly robust technology.

However, Autopilot *can* be restricted reasonably, because:
  • It's feasible. In fact, it's implemented already (Tesla already restricts the speed of Autosteer).
  • There's no safety issue with disabling Autopilot in cases where its performance is questionable. You still have full use of the car when under manual control.
Unlike high-speed crashes, which are rare, the exposure here will be huge: there will soon be many, many cars on the road with an autopilot-like feature (thank you, MobileEye), so I think the net liability risk will grow to the point where any manufacturer who implements autonomous technology will try very hard to prevent it from being used where it shouldn't be. This may be one reason why other manufacturers have proceeded very cautiously with their autonomous rollouts.
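
The two positions in this exchange (cap Autosteer's speed off-freeway vs. disable it entirely) can be stated as a tiny policy function. A minimal sketch, with hypothetical names throughout; the +5 mph margin in the permissive branch is an illustrative guess, not Tesla's actual figure:

```python
from dataclasses import dataclass

@dataclass
class AutosteerPolicy:
    allow_engage: bool
    speed_cap_mph: int | None  # None = no Autosteer-specific cap

def autosteer_policy(limited_access: bool, posted_limit_mph: int,
                     strict: bool) -> AutosteerPolicy:
    """Contrast the permissive policy (engage anywhere, cap speed
    off-freeway) with the strict one (limited-access roads only)."""
    if limited_access:
        return AutosteerPolicy(allow_engage=True, speed_cap_mph=None)
    if strict:
        # The "negligent design" argument: if the road is known unsafe
        # for AP, don't allow it at all.
        return AutosteerPolicy(allow_engage=False, speed_cap_mph=None)
    # The permissive alternative: allow engagement but cap speed near
    # the posted limit (the margin here is an assumption).
    return AutosteerPolicy(allow_engage=True,
                           speed_cap_mph=posted_limit_mph + 5)

print(autosteer_policy(limited_access=False, posted_limit_mph=55, strict=True))
```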
 