
My friend's model X crashed using AP yesterday

We use AP all the time, but watch it like hawks; it's too flaky to be left unsupervised.

One thing I do wish Tesla would add is a much clearer notification of when it's in AP and when it turns off. I've hit the edge of the brake pedal more than once and turned off AP accidentally. It dings, but with music playing it's not all that obvious, especially when I'm keeping an eye on traffic (since I don't really trust AP).
 
  • Like
Reactions: neroden and Matias
John, I say this with no intent to provoke: how many miles have you driven on AP?

I do not own an AP car but have driven my parents' car on AP for a few hours in total, and only on Southern California freeways. My opinion is that, for many people (but nowhere near "all" people), AP is in fact complex enough -- and a sufficiently different driving experience -- to warrant a real-life training period with a properly trained person in the passenger seat.

Zero. However, I am familiar with the way the system works, and my opinion is that if it's still too complicated for people after seeing an instructive video (I'm talking a professionally done one), then I have a hard time believing that a training course will help. Most of these "accidents" appear to stem from knowingly misusing the system (which is not something I believe training would prevent) or from using it as a means to deflect responsibility.

If, in your experience, you feel that a training course would have benefited people in these current cases...I will have to defer to you on that.
 
MODS: Since this thread has now been picked up by the media and is being portrayed as another AP crash, I suggest a more accurate title, since it's not clear what happened and whether AP was, in fact, involved.

Since "AP ate my homework" seems to be the new handy excuse for owners that might have made some questionable choices, I would suggest some TMC policy around this, so TMC does not become the incubator for more BS stories as it did for SuspensionGate.
 
I need someone - probably with training as a lawyer - to let me know whether the following set of enthymemes is valid:

1.
IF we can posit that a manufacturer who creates and sells an auto that can reach sustained speeds of 160 mph is able, effectively, to rely on common sense, traffic laws and experience to avoid liability exposure to an irresponsible driver who uses such a vehicle in reckless fashion (e.g., 150 mph on a back road with lousy shoulders at 2 am - just to throw out an example.... ;) )

2.
THEN can we also expect a manufacturer who releases a vehicle with a Driver Assist function likewise to rely on the same in order to shield itself effectively against a similar nincompoop?

The reason for this gedankenexperiment is that I am reading that not only the US NHTSA but also the NTSB AND Germany's something-or-other are becoming involved in all this.
 
I need someone - probably with training as a lawyer - to let me know whether the following set of enthymemes is valid:

1.
IF we can posit that a manufacturer who creates and sells an auto that can reach sustained speeds of 160 mph is able, effectively, to rely on common sense, traffic laws and experience to avoid liability exposure to an irresponsible driver who uses such a vehicle in reckless fashion (e.g., 150 mph on a back road with lousy shoulders at 2 am - just to throw out an example.... ;) )

2.
THEN can we also expect a manufacturer who releases a vehicle with a Driver Assist function likewise to rely on the same in order to shield itself effectively against a similar nincompoop?

The reason for this gedankenexperiment is that I am reading that not only the US NHTSA but also the NTSB AND Germany's something-or-other are becoming involved in all this.

I don't think it is valid, but I'm not a lawyer. My reasoning:

A manufacturer can sell a car that goes 160 mph and can be driven recklessly because it is "unreasonable" to prohibit them from doing so. Until recently there was no technology to "nanny" a driver and prevent them from doing stupid things--at least not reliably. There are perfectly valid reasons to make a car that goes 160 mph: for use on the Autobahn, or in Montana during the 90s (when there was no speed limit), or on a private road, or on a closed race course, or in an emergency, for example.

If it is impossible for a manufacturer to reasonably prevent misuse, and there are valid uses, you can sell something that allows misuse; any restriction would be unreasonable.

However, now that GPS and fancy computers exist within cars, it is reasonably possible to prevent some forms of misuse. If a manufacturer can foresee harm, and can take reasonable precautions to prevent it, they must do so to avoid liability. One example is to disable Autopilot on roads that Tesla states are unfit for Autopilot. These roads are already known to Tesla, because they set AutoSteer speed limits on them via GPS. So in this very specific case, the technology exists to reliably prevent harm. This is unfortunate for Tesla, because they collect valuable training data when Autopilot is corrected on these roads.
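
A minimal sketch of what such a GPS-based restriction might look like, in Python. Everything here is hypothetical - the road classes, the speed caps, and the function names are illustrative guesses, not Tesla's actual logic:

```python
from enum import Enum, auto

class RoadClass(Enum):
    """Hypothetical road classification derived from GPS/map data."""
    DIVIDED_HIGHWAY = auto()
    UNDIVIDED_ROAD = auto()
    RESIDENTIAL = auto()

def autosteer_policy(road: RoadClass, speed_limit_mph: float,
                     strict: bool = False) -> tuple[bool, float]:
    """Return (autosteer_allowed, max_autosteer_speed_mph).

    strict=False mimics the capped-but-available behavior described
    in this thread; strict=True is the disable-entirely alternative
    argued for above. All numbers are made up for illustration.
    """
    if road is RoadClass.DIVIDED_HIGHWAY:
        return True, speed_limit_mph + 5.0    # normal operating domain
    if strict:
        return False, 0.0                     # off-highway: no AutoSteer
    return True, min(speed_limit_mph, 45.0)   # off-highway: speed-capped
```

The point of the sketch: once the map data already drives a speed cap, switching to the strict policy is a one-line change. The question is policy, not feasibility.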

Now, once it can be shown that Autopilot is statistically safer on these roads than human drivers, it should be allowed, if not encouraged.
 
  • Like
Reactions: KZKZ
And here we go ...

Clearly, on a dark, two-lane undivided back road at 2 AM with sketchy lane markers for the camera to work with and no car ahead for the radar to follow, why *wouldn't* you take her up to 60 on full Autopilot to ensure a safe trip and impress your friends?

What hours of the day does AP show up to work? :)

I would think a system that is marketed as being TWICE as safe as an average driver would be especially invaluable while driving late at night when a driver could become sleepy.
 
  • Disagree
Reactions: Topher
What hours of the day does AP show up to work?

I would think a system that is marketed as being TWICE as safe as an average driver would be especially invaluable while driving late at night when a driver could become sleepy.

The point is that it's not designed for use on this type of road. If you decide to use it anyway, then given that the system is supposed to work with driver awareness, night (when visibility is limited) is probably not the best time to do so.
 
  • Like
Reactions: neroden
The point is that it's not designed for use on this type of road. If you decide to use it anyway, then given that the system is supposed to work with driver awareness, night (when visibility is limited) is probably not the best time to do so.

Why would a Lane Keeping Assist (wink wink) system not be allowed to work on that type of road?

Isn't the purpose of a LKA system to nudge you back on the road if you get close to the line?

This accident is the perfect example of where a properly designed LKA may have helped.
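
To make the "nudge" idea concrete, here is a toy proportional lane-keeping controller - a sketch with made-up constants and names, not any production system:

```python
def lka_steering_torque(lateral_offset_m: float,
                        lane_half_width_m: float = 1.8,
                        margin_m: float = 0.3,
                        gain: float = 4.0) -> float:
    """Toy LKA: apply corrective torque only when near a lane line.

    lateral_offset_m: offset from lane center (positive = drifting right).
    Returns steering torque in arbitrary units (positive = steer left).
    """
    nudge_zone = lane_half_width_m - margin_m   # where assist kicks in
    if abs(lateral_offset_m) < nudge_zone:
        return 0.0                              # well centered: no assist
    overshoot = abs(lateral_offset_m) - nudge_zone
    torque = gain * overshoot
    return -torque if lateral_offset_m > 0 else torque
```

Note what even this toy controller needs: a visible line to measure lateral_offset_m against. On a dark road with sketchy markings, that input is exactly what's missing.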
 
If it's not designed for that road, then why does Tesla allow its use on that road?
As mentioned by others, it's the same reason automakers don't lock out the top speed of cars according to road conditions (even though technically they certainly could).

The other technical aspect is the avoidance of false positives that would make usage of the system extremely annoying. I just came across an example: GPS units carry a legal warning that addresses should not be entered while driving, and the software certainly could enforce that (by detecting movement and refusing to let you enter an address), but GPS manufacturers don't implement it.
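
As a sketch, the lockout itself is trivial - a few lines - which is why the real question is the false-positive cost, not feasibility (the threshold and names here are made up):

```python
def can_enter_address(gps_speed_mps: float,
                      threshold_mps: float = 2.0) -> bool:
    """Hypothetical motion lockout for a navigation unit.

    Blocks address entry whenever GPS speed exceeds a threshold.
    The false positives are the catch: noisy GPS speed, a passenger
    typing, a user on a ferry or train - all would be locked out too,
    which is presumably why vendors show a warning instead.
    """
    return gps_speed_mps < threshold_mps
```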
 
Except... Tesla does do this. The top speed of AutoSteer is artificially limited depending on the type of road. So it's possible, and is happening.
Yes, it limits AutoSteer's top speed, but it doesn't limit the car's top speed: you can still go top speed on any road you want. Basically my point was that the reasoning is similar for why it is not reasonable to expect Tesla to disable Autopilot (or other functionality) automatically based on road conditions. Just because it is technically possible to do so does not mean it necessarily makes sense to do so.
 
  • Like
Reactions: Pdub2015
Why would a Lane Keeping Assist (wink wink) system not be allowed to work on that type of road?

Isn't the purpose of a LKA system to nudge you back on the road if you get close to the line?

This accident is the perfect example of where a properly designed LKA may have helped.

The system has limitations, which is the reason for the recommended usage. I don't think it's fair to place blame on the system when it is used outside of its designed parameters.

LKA doesn't mean it will work effectively on every road simply because the road has a "lane." The system can't counter animals, or vehicles, veering into the lane it is "keeping." So why would someone use the system on a 2-lane country road, at night?
 
The system has limitations, which is the reason for the recommended usage. I don't think it's fair to place blame on the system when it is used outside of its designed parameters.

LKA doesn't mean it will work effectively on every road simply because the road has a "lane." The system can't counter animals, or vehicles, veering into the lane it is "keeping." So why would someone use the system on a 2-lane country road, at night?

If you have to turn AutoPilot off while on the most common type of road in the US, maybe AutoPilot really isn't just an ACC + Lane Keeping Assist system after all.

A Lane Keeping Assist system is supposed to assist the driver if the driver unintentionally leaves his lane of travel. Why would you turn that safety net off?
 
I think both the NHTSA and NTSB reviews will be useful. My predictions:
  • The NHTSA investigation will find there is no intrinsic flaw in AP HW or SW that causes accidents when used as directed by Tesla.
  • The NTSB investigation may make some recommendations for AP UI enhancements based on their experience with aviation autopilot--things like clearer indications of when the car is/is not in AP, or when AP gets disengaged.
I do not think either agency will act in a way that absolves the driver of their responsibilities to use good judgement.
 
Yes, it limits AutoSteer's top speed, but it doesn't limit the car's top speed: you can still go top speed on any road you want. Basically my point was that the reasoning is similar for why it is not reasonable to expect Tesla to disable Autopilot (or other functionality) automatically based on road conditions. Just because it is technically possible to do so does not mean it necessarily makes sense to do so.

I can see why it may not make business sense to do so. However, from a safety perspective, I don't see any reason why Tesla should limit the top speed of AP on known unsafe roads yet not go all the way and disable AP completely. As it stands now, Tesla is explicitly programming the car to allow use of AutoPilot on roads they know it was not designed for, just with restricted speeds. If Tesla knows AP is unsafe in a particular situation, and it is reasonable to turn it off, it should not be allowed at all; otherwise you have negligent design. That differs from maximum speed restrictions on cars, where it is impossible to judge in real time whether a given speed is appropriate or not.
 
  • Like
Reactions: neroden and KZKZ
If you have to turn AutoPilot off while on the most common type of road in the US, maybe AutoPilot really isn't just an ACC + Lane Keeping Assist system after all.

A Lane Keeping Assist system is supposed to assist the driver if the driver unintentionally leaves his lane of travel. Why would you turn that safety net off?

You don't turn it off so much as not turn it on. It's designed for freeway use; there is no point in arguing for its usability outside of that.
 
I need someone - probably with training as a lawyer - to let me know whether the following set of enthymemes is valid:

....

The reason for this gedankenexperiment is that I am reading that not only the US NHTSA but also the NTSB AND Germany's something-or-other are becoming involved in all this.

Unfortunately, in many cases this rational approach to thinking about the problem doesn't matter, since the decisions can wind up being made by a jury or a government worker.

Not quite the same, and a lawsuit is not the same as a victory, but interesting nonetheless: Woman Follows Google Maps "Walking" Directions, Gets Hit, Sues

Remember, in the U.S. you have to put a label on the top of a ladder warning people to not stand on it, because if they do, and fall off, woe be to you!
 
However, from a safety perspective, I don't see any reason why Tesla should limit the top speed of AP on known unsafe roads, yet not go the full mile and disable AP completely.

Because the 99% of owners who are actually responsible would not like the use and functionality of their cars limited in order to cater to future Darwin Award winners.
 
  • Like
Reactions: Topher