
My friend's Model X crashed using AP yesterday

[QUOTE="KZKZ, post: 1626544, member: 49372"...A safety assist system should always be available no matter what kind of road you are on...[/QUOTE]

In the future, advanced Autopilot may work on all kinds of roads, but not now.

With this first version, it is important for the driver to intervene when the steering is off the mark.
 
Agree 100% that the person and computer need to work together, which is why I think it's odd that the computer, aka AutoPilot, should be turned off on anything but divided highways.

A safety assist system should always be available no matter what kind of road you are on.

Am I looking at this wrong? Why wouldn't lane keeping always be available to potentially save the day if the driver screws up?
Because that's not what it's designed for.
 
The older-style wooden posts and cable in post #37 of "My friend model X crash bad on AP yesterday" indicate a secondary road, and the barrier's proximity to the road suggests a curve and drop-off. The 4th photo at that link shows railway tracks at the back of the lighted area.

I'm not saying the link below is the location, but it fits with a curve, posts and cable, an embankment, railway tracks at the bottom, no cell reception, and it's close to Whitehall, MT, as in the cell phone screenshot. Sooner or later Tesla will confirm the location of the crash, but I suspect it is very much like this:

Google Maps

So how much Google Maps driving did it take to find that? It's on a completely different road than the OP claimed.
 
Similarly, I view the operation of Autopilot as drastically impaired on certain types of roads and certain weather conditions. It makes decisions which, if left uncorrected by the chaperone, result in accidents (e.g. running stop signs). It is effectively driving under the influence.

That is the point: AP is not driving; the driver should be driving. In the example you cite, you are asking AP to do something it was never intended to do and never advertised as doing. If an AP car rolls through a stop sign, that's 100% on the driver.
 
The trouble with trying to make things foolproof is that there are a lot of very ingenious fools.

Autopilot does OK on the expressway, particularly if the lines on the road are painted well. It seems to do OK at very low speed in cities.

It should not be used on high-speed, undivided roads with intersections. There's no way for the software to do lane keeping on those roads, none at all, and it shouldn't pretend to.

Perhaps automated braking could remain active on these roads, but not lane keeping.

It's probably not straightforward to geofence Autopilot to only work on expressways. If I were Tesla, I'd still try to do so immediately.
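To make the idea concrete, here's a rough sketch of what such a gate might look like, assuming the car can look up the current road's functional class in its onboard map data. Everything below is hypothetical: the road-class tags are OpenStreetMap-style, and the function is mine, not anything Tesla actually ships.

[CODE]
# Hypothetical Autosteer gate based on road class from map data.
ALLOWED_ROAD_CLASSES = {"motorway", "motorway_link"}  # divided, limited-access roads

def autosteer_permitted(road_class: str, lane_markings_visible: bool) -> bool:
    """Allow Autosteer only on limited-access highways with clear lane markings."""
    return road_class in ALLOWED_ROAD_CLASSES and lane_markings_visible

# A rural two-lane road ("secondary" in map terms) would be locked out:
print(autosteer_permitted("secondary", True))  # False
print(autosteer_permitted("motorway", True))   # True
[/CODE]

The hard part isn't the check itself; it's keeping the map data accurate everywhere, which is probably why it isn't straightforward.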

My local Tesla owners club has, among those who own cars with Autopilot, *one* couple who are using it responsibly on the Interstate (it works great), and a whole bunch who keep trying to use it on rural roads (it doesn't work well, duh).

Actually it works bluddy well on well-marked two-lane rural roads with traffic. You must pay attention to intersections and take over as needed, and it's speed-limited, but it does very well indeed.

Have you ever used it for commuting on roads like that? I have, and would be VERY annoyed if I were geofenced out of that ability (for which I have paid) by some nanny.

If that makes me an ingenious fool, I will wear the title gladly.
 
When everyone else is exceeding the speed limit, it's safer if the car can, too.
During an emergency, it may be necessary to exceed the speed limit.
To clarify, I'm not talking about simply exceeding the speed limit. I'm talking about reckless driving. There isn't going to be a situation where you *need* to go 106+ mph.

GPS and the automatic determination of speed limits are new technology, unproven and unreliable. If such a system fails, the public can be at risk due to the points above. They're even more at risk should they be erroneously limited to a speed far below the speed limit (e.g., if the system places them on the wrong road).
Again, I'm talking about limiting speeds to 106 mph. There isn't going to be any safety risk in banning that.

If a consumer knows a car can go 155 mph, they know it has the power, and thus the acceleration, to operate safely in highway merge scenarios, etc. If it's limited, they begin to suspect it won't be able to accelerate safely. This perception is hard to overcome.
We already have other metrics for acceleration: 0-60 for highway merges and 50-70 mph performance for passing. Top speed doesn't have anything to do with that.

  • The number of crashes caused by high speeds is relatively low. So low, in fact, that it would be cheaper for automakers to settle lawsuits than to suffer lost sales from speed-limited cars. It would likely take a federal law mandating governors before manufacturers would comply.
Citation required for the low number of high-speed crashes (it is very easy to google crashes of cars going 100 mph), but if you are using that argument, the number of crashes caused by automakers not limiting lane keeping to divided highways is also relatively low. By that logic, why wouldn't "It would likely take a federal law mandating (lane keeping) governors before manufacturers would comply" apply?

Note that in the U.S. the Skyline GTR is limited to 156 mph and is not GPS enabled. It's only in Japan, where all cars are limited to 112 mph, that the Skyline offers a GPS geofence option, and even there it only detects whether or not you're at a race track and removes the static limiter. Hardly robust technology.
I'm talking about exactly the same type of limiter. There are only going to be a limited number of tracks in the US where you can go significantly faster than 100 mph (I know this from threads discussing suitable locations to safely test the top speed of the Model S). A GPS-based limiter of this type would be very easy to implement. The same reason that automakers don't implement it (and nobody expects them to) is the same reason people are arguing for Tesla not to put in a restriction.
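To show how little it would take, here's a rough sketch of a limiter of that type. Purely illustrative: the track coordinates, radius, and function names are made up, not any manufacturer's actual code.

[CODE]
from math import radians, sin, cos, asin, sqrt

# Hypothetical whitelist of race tracks: (latitude, longitude, radius in km).
TRACK_WHITELIST = [
    (40.6886, -119.3547, 5.0),  # made-up entry for illustration
]

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two GPS points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def speed_cap_mph(lat, lon):
    """Apply a static cap unless GPS places the car inside a whitelisted track."""
    on_track = any(distance_km(lat, lon, t_lat, t_lon) <= radius
                   for t_lat, t_lon, radius in TRACK_WHITELIST)
    return None if on_track else 112  # None = limiter lifted, as with the Japanese GT-R
[/CODE]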

However, Autopilot *can* be restricted reasonably, because:
  • It's feasible. In fact, it's implemented already (Tesla already restricts the speed of Autosteer).
  • There's no safety issue with disabling Autopilot in cases where its performance is questionable. You still have full use of the car under manual control.
Unlike high-speed crashes, which are rare, there will soon be many, many cars on the road with an Autopilot-like feature (thank you, Mobileye), so I think the net liability risk will soon grow to the point where any manufacturer who implements autonomous technology will try very hard to prevent it from being used where it shouldn't be. This may be one reason why other manufacturers have proceeded very cautiously with their autonomous rollouts.
People keep saying that "other manufacturers" are doing things "cautiously", but I have found that to be false (the Infiniti video is a prime example). For this specific scenario, none of the other automakers have implemented locking lane keeping out of anything but divided highways, even though they have the same (or superior) sensors that can detect a divided highway. There is a double standard here when it comes to Tesla.
 
Here's the Google Maps view of the "Winding Road" sign with a recommended speed of 45 MPH, right where the wood-post "guard-rail" starts (the OP says the Tesla was going 60 MPH):

Google Maps

Ah, now we're getting somewhere. I've found that Autopilot cannot read speed limit signs of this type, and "transient" speed limits, like those posted for tight curves, are rarely in the GPS database.

If the driver had maxed out Autopilot at the prior speed limit (55 MPH), it would not have slowed the car down in advance of the turn. Thus, even if Autopilot read the curve, it may have been going too fast to make the turn.

Autopilot normally slows down to roughly the speed limit when on undivided roads, so the driver might have assumed that it would always do so. However, it most certainly will not if it can't read the sign and the transient curve speed restriction is not in the database. The driver probably also missed the sign and relied on the Tesla-supplied speed limit indicator on the instrument panel, which was wrong.
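To spell out the failure mode (this is my reading of the behavior, not Tesla's actual code): the car holds the lowest of the driver's set speed, a limit read from a sign, and a limit from the map database, so an advisory it never sees simply never applies.

[CODE]
def ap_target_speed_mph(driver_set_speed, sign_reading=None, db_limit=None):
    """Hold the lowest speed among the sources the car actually has."""
    candidates = [driver_set_speed]
    if sign_reading is not None:
        candidates.append(sign_reading)
    if db_limit is not None:
        candidates.append(db_limit)
    return min(candidates)

# Montana scenario: set speed 60 mph (per the OP), and the 45 mph curve
# advisory is neither read from the sign nor present in the database.
print(ap_target_speed_mph(60))                   # 60 -- carried into the curve
print(ap_target_speed_mph(60, sign_reading=45))  # 45 -- only if the sign were read
[/CODE]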

If you need a Ph.D to figure out all of the nuances of operating Autopilot on roads it was never intended for, perhaps it shouldn't be allowed to activate on those roads. If the car slowed down to previous speed limits, it's not unreasonable (but very, very wrong) to assume it always will. Yes, the driver was an idiot, but that doesn't mean he couldn't be made safer.
 
Here's an article that quotes the State Troopers who responded to the accident. They also say that it occurred on Highway 55 (not MT 2 as shown in another post here). I used Google Earth Street View to travel several miles down that road, and with the exception of that bridge just south of Whitehall, it's a nice, wide, mostly straight road with very wide rights-of-way on either side. If it did happen at that bridge, it's in a 45 mph zone too. Here's part of the article:

Regarding the Montana accident, Trooper Jade Schope of the Montana Highway Patrol declined to identify either the driver or passenger, but he did say the driver said he activated the car's Autopilot driver assist system at the beginning of the trip.

"That's what he stated. I have no way of verifying whether it was or wasn't," Schope said. "He also stated that he was driving from Seattle to West Yellowstone, Mont."

The accident occurred after midnight Sunday morning after the driver had gotten off I-90 near Whitehall, Schope said.

There was a sharp drop-off from the two-lane Highway 55 where, the driver told Schope, the car began veering to the right and hit a wooden guardrail. The driver was able to stop the vehicle before it left the road completely.

"He lost the right front wheel and there was extensive damage to the front of the vehicle," Schope said.
 
If you need a Ph.D to figure out all of the nuances of operating Autopilot on roads it was never intended for, perhaps it shouldn't be allowed to activate on those roads. If the car slowed down to previous speed limits, it's not unreasonable (but very, very wrong) to assume it always will.

You don't need a Ph.D to understand that you're ultimately responsible for driving the car. I've honestly reduced the amount I use AP because of the truck lust. I did that knowing that it was MY responsibility to drive my car in a safe manner, and that if I crashed due to truck lust, it was going to be my fault. I felt like the infrequency of it happening, combined with the sinusoidal correction when I took control, meant that it exceeded what I felt safe with.

In terms of having a lockout on the feature, it's tough to say. The entire feature is largely dependent on fleet learning, and it's a little hard to learn without ALL the data, without the corrective-action marking that happens.

Plus, if you do start locking out roads, then people might incorrectly assume it's safe on roads that aren't locked out. It took me less than 5 minutes of driving on a curvy single-lane highway to realize that it wasn't safe to use in those moments. That feedback was important; what if I don't get that feedback? Then I might start assuming it's better than it really is. By leaving it fairly unlocked, it does a great job illustrating what it's good at and bad at.

On a personal level, I don't believe in any level of nannying. It's part of WHY I bought a Tesla, and part of why I bought a Porsche a few years ago. I like things that assume the person is going to take responsibility to use them in a safe manner.
 
Here's an article that quotes the State Troopers who responded to the accident. They also say that it occurred on Highway 55 (not MT 2 as shown in another post here). I used Google Earth Street View to travel several miles down that road, and with the exception of that bridge just south of Whitehall, it's a nice, wide, mostly straight road with very wide rights-of-way on either side. If it did happen at that bridge, it's in a 45 mph zone too. Here's part of the article:
Hwy 55 doesn't fit the topography or the post-and-cable barrier up against the road. The article also says the driver was travelling from Seattle to West Yellowstone, MT, which puts him on or close to MT 2. Google Maps directions say to take the MT 2 exit.
 
Ah, now we're getting somewhere. I've found that Autopilot cannot read speed limit signs of this type, and "transient" speed limits, like those posted for tight curves, are rarely in the GPS database.

If the driver had maxed out Autopilot at the prior speed limit (55 MPH), it would not have slowed the car down in advance of the turn. Thus, even if Autopilot read the curve, it may have been going too fast to make the turn.

Autopilot normally slows down to roughly the speed limit when on undivided roads, so the driver might have assumed that it would always do so. However, it most certainly will not if it can't read the sign and the transient curve speed restriction is not in the database. The driver probably also missed the sign and relied on the Tesla-supplied speed limit indicator on the instrument panel, which was wrong.

If you need a Ph.D to figure out all of the nuances of operating Autopilot on roads it was never intended for, perhaps it shouldn't be allowed to activate on those roads. If the car slowed down to previous speed limits, it's not unreasonable (but very, very wrong) to assume it always will. Yes, the driver was an idiot, but that doesn't mean he couldn't be made safer.

The speed limit is only the maximum speed allowed; it's the driver's responsibility to adjust speed according to conditions and the driver's abilities. If AP is driving too fast, you are supposed to take over, no ifs about it.

(Also, let's keep in mind that in this crash Tesla has not even confirmed whether AP was on.)
 
Hwy 55 doesn't fit the topography or the post-and-cable barrier up against the road. The article also says the driver was travelling from Seattle to West Yellowstone, MT, which puts him on or close to MT 2. Google Maps directions say to take the MT 2 exit.

I know. That's why I wanted to point that out, but both the OP and the State Police, according to the article, say it happened on Highway 55. There's just something about this whole thing that doesn't pass the smell test IMHO.
 
I'm not sure I understand what you're asking. Lane keeping (aka AutoPilot) is always available as long as there are road/lane markings clear enough for the system to see and use.

In addition, when the driver disables AutoPilot he/she still has the option to have the Lane Departure Warning system turned on at all times.

Mike

Don't bother, Mike. KZKZ is trolling everyone on the forum; stop feeding him/her. KZ needs to read what the manual says about AP for the first time.
 
I'm starting to fear Tesla will limit AP even more and screw it up for the rest of us because of the stupidity of others :(

I feel the same, but instead of stupidity I would call it negligence. Because, as far as I understand, a person has to acknowledge that he/she has to be ready to take over at any moment. If one wants to use the technology, be ready to comply with the warnings and expectations.
E.g., if one is playing with an OS's registry keys, be ready to take responsibility for some screw-up happening / be ready to revert the bad setting. So should Windows not allow people to change registry keys/settings? Of course it should.

And going by the logic that AP should be banned, most online banking and services, computers, skydiving, etc. would be banned / severely restricted.
This is a Level 2 system and is not supposed to be used without supervision.
 
To clarify, I'm not talking about simply exceeding the speed limit. I'm talking about reckless driving. There isn't going to be a situation where you *need* to go 106+ mph.

On public roads, no. On private roads, yes. Manufacturers will claim that "Montana could always increase the speed limit", or cite the need to sell the same model in Germany, where stretches of the autobahn have no speed limit, and talk about dirty data and whitelists and people who take their car to a new track only to find that they don't have the map update for it yet (most manufacturers do not offer OTA downloads). They will cite safety issues and whatnot. Weak claims, but since high-speed crashes are relatively rare, the courts find there is no reasonable way to prevent reckless speeding without compromising traditional legitimate uses.

Clearly, though, Japan has already moved into this kind of regulatory framework, and GPS and OTA technology is relatively new, so eventually you may see a court rule for negligent design, especially as autonomous cars show the world what's easily achievable. And I suspect fear of lawsuits is why there is a "gentlemen's agreement" among manufacturers to limit top speed to 155 mph, rather than, say, 200 mph.

Citation required for the low number of high-speed crashes (it is very easy to google crashes of cars going 100 mph), but if you are using that argument, the number of crashes caused by automakers not limiting lane keeping to divided highways is also relatively low. By that logic, why wouldn't "It would likely take a federal law mandating (lane keeping) governors before manufacturers would comply" apply?

Well, I should be more clear: I don't think there are many lawsuits, because it's a cultural myth that you can't create a good limiter, and no legal precedent has been set. But assuming these lawsuits could proceed, you'd see far fewer reckless-speeding liability claims than Autopilot claims. How many people take their cars above 120 mph? One in a thousand, I'd guess. How often? Not that often. What percentage of Autopilot users will engage Autopilot inappropriately? I'm going to claim 50 percent. How often? Almost all the time, because it's used for regular driving. That's at least 500x more hours/miles, likely 10000x; accidents would be far more prevalent, and the liability begins to add up. That's sufficient for multiple class-action lawsuits.
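Spelling out that back-of-envelope math with the guesses above (every number here is an assumption, not data):

[CODE]
speeder_fraction = 1 / 1000  # "one in a thousand" take their cars above 120 mph
misuser_fraction = 0.50      # my claimed 50 percent who engage Autopilot inappropriately

print(misuser_fraction / speeder_fraction)  # 500.0 -- 500x more drivers exposed

# Frequency multiplies that: reckless speeding is occasional, while inappropriate
# Autopilot use happens "almost all the time"; call it 20x as often (another guess).
frequency_ratio = 20
print(misuser_fraction / speeder_fraction * frequency_ratio)  # 10000.0 -- the 10000x
[/CODE]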

I'm talking about exactly the same type of limiter. There are only going to be a limited number of tracks in the US where you can go significantly faster than 100 mph (I know this from threads discussing suitable locations to safely test the top speed of the Model S). A GPS-based limiter of this type would be very easy to implement. The same reason that automakers don't implement it (and nobody expects them to) is the same reason people are arguing for Tesla not to put in a restriction.

Clearly you're not buying my reason, so I'm at a loss: why don't automakers put in a GPS-based governor?

I think you'll see a cultural shift where inappropriate autopilot use is going to be viewed a lot more like drunk driving. From the evidence we have right now, it appears that a substantial portion of the population both engages Autopilot inappropriately and is unable to control Autopilot in all situations.

If society had a magic technological switch that would eliminate all drunk driving, they would throw it. And that's what Autopilot restrictions are: a means to consistently, and reliably, prevent Autopilot from driving drunk in a situation it was never intended to handle.

That said, this argument only has weight if Autopilot is substantially less safe in some situations than human drivers acting alone. I think that's the case right now, but ten years from now it most certainly will not be, so what we have here is a temporary phenomenon, and thus temporary restrictions.

People keep saying that "other manufacturers" are doing things "cautiously", but I have found that to be false (the Infiniti video is a prime example). For this specific scenario, none of the other automakers have implemented locking lane keeping out of anything but divided highways, even though they have the same (or superior) sensors that can detect a divided highway. There is a double standard here when it comes to Tesla.

It is a funny thing that the better an autopilot system is, the more is expected of it, because it opens people's eyes as to what is achievable. These other systems are so primitive that nobody uses them and nobody expects anything of them (similarly, nobody expects car manufacturers to be able to speed-limit their cars). As they get better and more popular, I think you'll see more people pushing the automakers to further improve.
 