Three recent autopilot/cruise control accidents

From your lips to God's ears. If they don't, people will just keep having wrecks in AP1.0 cars. Some will die. Hoping Tesla sees this and feels a responsibility to save these lives.
Two words: personal responsibility. I treat my BMW cruise control as a driver assist function only. I accept full responsibility for being behind the wheel. Others should do the same.
 
I also believe 2.0 hardware will not be available as a retrofit and I am ok with that. Our 1.0 hardware can see other cars and motorcycles. As the software improves in quality, so will our lane-keeping and TACC experience.

What the two accidents involving not braking in time have in common is that the driver was not suitably overseeing the operation. Our job as the driver is to take over WELL BEFORE the last possible moment to stop. Remember that a really rapid deceleration can lead to the car behind us hitting our tail end. Thus, our job is to take over any time we feel uncomfortable with the situation. I have never personally allowed autopilot or TACC to brake really heavily to stop the vehicle in a questionable situation. I have always taken over earlier, when I saw a questionable situation developing.

In the case of the TACC accident with the van extending well into the lane, the driver needed to quickly determine whether he was going to stop or whether traffic allowed a swerve into the other lane, as the traffic ahead had done. Delaying that decision was the cause of the accident, not TACC's being placed in an ambiguous situation. TACC really didn't know the driver's plan and slamming on the brakes too early would have been problematic if the driver had the clearance to enter the other lane and pass clear of the van and planned to do so. I suspect that if the Tesla was operating under full autopilot, it would have stopped in time for the van, because there would be no ambiguity in that situation about whether the Tesla driver planned to enter the other lane, as the car ahead had done.

In the case of the driver who accidentally stepped on the brake (according to Tesla) and thereby disabled autopilot, this is again a case of the driver waiting too long before recognizing things were looking bad and beginning the heavy braking. With this accident, though, the audibility of the autopilot disengage sound could well have been a factor, but even if there was no disengage sound at all, the driver should have initiated braking well before the remaining distance to stop became questionable.

Tesla says that even at this early stage of autopilot development, the number of collisions that result in airbag deployment is half the average for drivers driving without autopilot. That safety benefit will only grow as autopilot evolves. In the meantime, while autopilot is still a work in progress, use the workload it frees up by handling lane-keeping and holding a specific speed to pay more attention than you would in manual driving to anticipating problems ahead. You are still in charge all the time, but you have some great tools to relieve the workload so that you can concentrate on the important stuff, like questionable traffic situations and questionable highway markings.
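To put rough numbers on why taking over early matters, here is a minimal sketch. The reaction time and deceleration values are assumed, not measured Tesla figures, but they show how much more room a gentle, early takeover needs compared with a last-moment hard stop that risks the car behind hitting your tail end.

```python
# Rough back-of-the-envelope numbers (assumed, not Tesla data) illustrating why
# taking over "well before the last possible moment" matters at highway speed.

def stopping_distance_m(speed_kph, reaction_time_s, decel_mps2):
    """Distance covered during driver reaction plus braking to a full stop."""
    v = speed_kph / 3.6                      # convert km/h to m/s
    reaction = v * reaction_time_s           # distance travelled before braking starts
    braking = v ** 2 / (2 * decel_mps2)      # v^2 / (2a) braking distance
    return reaction + braking

speed = 110  # km/h, a typical highway speed
# Comfortable braking (~3 m/s^2) vs. near-maximum braking (~8 m/s^2) - assumed values.
for label, decel in [("comfortable braking", 3.0), ("hard braking", 8.0)]:
    d = stopping_distance_m(speed, reaction_time_s=1.5, decel_mps2=decel)
    print(f"{label}: about {d:.0f} m to stop from {speed} km/h")
```

With these assumed values a gentle stop needs roughly twice the distance of a hard stop, which is exactly the margin you give up by waiting until the last possible moment.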
 
Good observations. Here's my concern and theory on autopilot, from my post in another thread here after using autopilot a lot on a loaner car in the past 4 days:

I think AP may create a problem with the human mind that I will coin: "Phantom Autopilot". This happens when you use autopilot a lot and you just expect the car to be driving itself, even when AP is not engaged. I don't even think better sound warnings, different graphics, etc. will make much of a difference to combat this phenomenon. I think the human brain gravitates towards patterns and reliance, and once it finds new ones, it expects them to be there, even when they are not. Well, at least my brain does. I found myself expecting the car to react even when AP was off, which makes no logical sense, but it did happen and it concerns me. Maybe it's just me?

Exactly, important point! Also called mode error (Mode (computer interface) - Wikipedia, the free encyclopedia); see the plane crash example.

With an appropriate UI design, mode errors regarding whether AP is on or off could certainly be reduced (faint blue lines or a blue steering wheel won't do), but probably not fully avoided.

So until fully autonomous cars can handle all situations without driver intervention, we will see more accidents like the recent ones.

I just hope that Model 3 won't be released to the masses with the current system.
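As an illustration of the UI point above about making the AP on/off state unmistakable, here is a hypothetical sketch. This is not Tesla's actual software and all the names are invented; it just shows one way every mode transition could be announced loudly instead of signaled with a faint blue line.

```python
# Hypothetical sketch (invented names, not Tesla's UI code) of a mode announcer
# that makes every engage/disengage transition unmistakable.

from enum import Enum

class DriveMode(Enum):
    MANUAL = "manual"
    TACC_ONLY = "tacc"
    AUTOSTEER = "autosteer"

class ModeAnnouncer:
    def __init__(self, announce):
        self.mode = DriveMode.MANUAL
        self.announce = announce          # callback: spoken prompt or loud, distinct chime

    def set_mode(self, new_mode: DriveMode):
        if new_mode is self.mode:
            return
        old, self.mode = self.mode, new_mode
        # Announce every change, with the strongest emphasis on any fallback to MANUAL,
        # since that is the transition drivers reportedly miss.
        urgent = new_mode is DriveMode.MANUAL
        self.announce(f"{old.value} -> {new_mode.value}", urgent=urgent)

# Example: print stands in for a voice prompt or chime.
announcer = ModeAnnouncer(lambda msg, urgent: print(("URGENT: " if urgent else "") + msg))
announcer.set_mode(DriveMode.AUTOSTEER)   # "manual -> autosteer"
announcer.set_mode(DriveMode.MANUAL)      # "URGENT: autosteer -> manual"
```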
 
Mode error, from the Asiana Airlines Flight 214 crash analysis:

Over-reliance on automation and lack of systems understanding by the pilots were cited as major factors contributing to the accident.

Sound familiar?
 
Great point, bladerunner. The other side of the coin is that experienced airline pilots will tell you that mode error is no excuse for the type of accident Asiana experienced, because the pilots are still in command and are absolutely required to keep an eye on airspeed during approach, even if they think the autopilot is in a mode that should take care of speed control. Just as with some Tesla autopilot or TACC accidents, there are some situations that require operator intervention, and for the operator to be oblivious to them, or too slow to react, is a necessary component for the situation to turn into an accident. In the case of the Asiana accident, nothing prevented one of the pilots in the cockpit from pushing the throttles forward when the speed became dangerously low on approach, and in the Tesla accidents, nothing prevented the driver from braking when a marginal situation was developing ahead. True, a software change to the autopilot could eliminate certain types of these accidents, but the pilot is still tasked with ensuring the plane maintains sufficient speed regardless of what the autopilot system is doing, and the Tesla driver is still tasked with reducing speed if a questionable situation is developing ahead.

Taken one step further, the pilots in the Asiana plane needed to verify, when the airspeed reached the correct value, whether the autothrottles were moving forward to hold that speed. This is a key moment where verification is needed. Similarly, there are key moments in driving a Tesla under autopilot when the operator needs to be on alert: for example, when the markings on the highway change significantly, when glare on the road causes the blue lane-edge indications to blink or disappear, or when there's a curve sharper than you have confidence the current version of autopilot can handle safely. In the case of TACC, you can add (based upon the TACC accident) situations where TACC moves the blue designation for the car you're following from its previous vehicle to one that is extending partway into the traffic lane.

Part of what we can do to improve safety within the Tesla community is to discuss situations that require the driver to pay particular attention and to take over and reduce speed when in doubt. Tesla autopilot is still imperfect. We need to identify the imperfections and learn to be alert for them when we see situations developing on the road.
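To make those key moments concrete, here is an illustrative sketch. The signal names and thresholds are invented for the example, not anything the car actually exposes; it just restates the situations above as explicit checks.

```python
# Illustrative only - a plain-Python restatement of the "key moments" above as
# explicit checks. Signal names and thresholds are invented for the example.

def needs_extra_attention(lane_confidence, followed_target_changed,
                          upcoming_curvature, trusted_curvature=0.01):
    """Return reasons the driver should be ready to take over right now."""
    reasons = []
    if lane_confidence < 0.5:              # markings changed, glare, lines blinking out
        reasons.append("lane markings unreliable")
    if followed_target_changed:            # TACC switched to a different lead vehicle
        reasons.append("followed vehicle changed")
    if upcoming_curvature > trusted_curvature:   # curve sharper than you trust AP with
        reasons.append("sharp curve ahead")
    return reasons

print(needs_extra_attention(lane_confidence=0.4,
                            followed_target_changed=True,
                            upcoming_curvature=0.02))
# ['lane markings unreliable', 'followed vehicle changed', 'sharp curve ahead']
```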
 
Sr4wrxttcs, how have you arrived at the conclusion that Tesla's radar cannot see a stopped vehicle? My experience doesn't jibe with this explanation.

You mention a good point about providing some braking protection even if TACC is deactivated. Again, the tough call for braking is the orientation of the obstacle ahead. At one extreme, if the obstacle is so far to the side that it can be avoided with minimal maneuvering, then aggressive braking would likely cause more accidents than it avoids. At the other extreme, if the obstacle is directly ahead of the Tesla and clearly blocking the lane, then aggressive braking would be prudent. The tough call is those situations where the obstacle extends into the side of the lane and the driver's ability to avoid it is in question. I would think those situations are still a work in progress and will be for some time. I would think that the van partially extending into the lane fell into this grey area as far as TACC is concerned, because the transition from tracking the previous vehicle to providing a collision warning for the van took place with little time to spare.
There is a specific scenario where TACC does not see a stopped vehicle. If you (in the Tesla) are following a vehicle (with TACC activated) and that vehicle changes lanes, TACC will not see a stopped vehicle further down in your lane. This is especially the case if the road speed is high. I have verified this behavior on two hair-raising occasions, and several other Tesla drivers have experienced the same. This is a known limitation. Also: the radar is not mounted high off the ground, so if there is a little rise in the road between the radar and the target, the road can block the radar from seeing it.

Moral of the story is: don't assume. If your Tesla is approaching another vehicle faster than you would under the same circumstances, take over control immediately.
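To illustrate the point about the low-mounted radar and a small rise in the road, here is a simple line-of-sight sketch. The heights and distances are assumed for the example, not Tesla specs.

```python
# Simple line-of-sight geometry (illustrative numbers, not Tesla specs) for the
# point about a low-mounted radar and a small rise in the road.

def crest_blocks_radar(radar_height_m, crest_height_m, crest_dist_m,
                       target_height_m, target_dist_m):
    """True if a rise in the road sits above the straight line from radar to target."""
    # Height of the radar-to-target sight line where it passes over the crest.
    frac = crest_dist_m / target_dist_m
    line_height_at_crest = radar_height_m + frac * (target_height_m - radar_height_m)
    return crest_height_m > line_height_at_crest

# Radar mounted ~0.5 m up, aiming at the ~0.4 m-high rear bumper of a stopped car
# 150 m away, with a 0.6 m rise in the road 75 m ahead (all assumed values).
print(crest_blocks_radar(radar_height_m=0.5, crest_height_m=0.6,
                         crest_dist_m=75, target_height_m=0.4, target_dist_m=150))
# True - the crest sits above the sight line, so the stopped car is hidden.
```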
 
AB4EJ, when you encounter this problem, are you driving with TACC only or with TACC functioning while lane-keeping is active? I suspect there's a difference in behavior between these two situations. Please let me know because I hope to try reproducing the situation and seeing for myself. Thx.
 
The problem is that Tesla drivers, unlike pilots, do not get any real training in using AP. After a brief rundown by the DS on the car's features, which by itself might be overwhelming for some, those drivers are let loose on the roads with AP. What they recall is the overstated marketing claims. Only a tiny minority will actually read the manual, which covers most of the points you listed.

That's the problem with AP: untrained drivers who already rely on it too much get lulled into even more reliance by a system that works very well most of the time.

Those drivers are not properly prepared, and their number increases every day. As a consequence, AP-related accidents will happen and scale up until a regulatory body or lawmaker puts an end to it.
 
Important observation. I agree.
 
Maybe some of the weekend IT meetings with Tesla staff (I forgot what it was called, there was a thread) should involve guidance on using AP, with on-the-road experience, for current (and maybe future) owners.
 
AP reduces the incidence of accidents by 50%. You're much safer with AP than without.
Now, that still leaves some accidents. I'm sure they are working on reducing the number of accidents but you do get to the point where you can't fix stupid.
 
At that point we'll need another safety system, like the one in the old joke: a dog trained to growl at the driver if he touches the controls.
 
To answer the question above: I have encountered the problem (TACC not seeing the stopped vehicle) while using TACC and Autosteer (I guess that's the same as lane-keeping). I have not tried it with TACC only enabled. I think you might be right that there is a difference between the scenarios with and without Autosteer active.
 
+1 to the earlier point about untrained drivers and over-reliance.
 
AP reduces the incidence of accidents by 50%.
I think this is the entire quote: “The probability of having an accident is 50% lower if you have Autopilot on. Even with our first version.

So we can see basically what’s the average number of kilometers to an accident – accident defined by airbag deployment. Even with this early version, it’s almost twice as good as a person."

This is a total distortion of the statistical base; he has drawn an inference that is not supported by the data. There is no randomized test here of people driving the same roads, in the same conditions, the same drivers with AP on and off. Elon's assertion would get him flunked in even a freshman statistics class, let alone the kind of advanced statistical analysis that should be applied to something as critical as autopilot. What the data he quotes shows is that when AP is on, there are 50% fewer airbag deployments. Period, that's all. As has been discussed in multiple threads, when the going gets tough, AP is turned off, so of course there are fewer accidents.

Let's be glad that Elon is not running the FDA!
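To see how that selection bias could produce the 50% figure on its own, here is a toy calculation with invented numbers in which Autopilot adds zero safety benefit but is engaged mostly on easy highway miles.

```python
# Toy numbers (all invented) for the selection-bias argument above: if Autopilot
# is engaged mostly on easy highway miles and switched off when the going gets
# tough, AP-on miles can show roughly half the airbag-deployment rate even if
# AP itself adds zero safety benefit.

P_EASY = 0.8                      # share of miles that are easy (open highway)
RISK = {"easy": 1 / 200_000,      # assumed deployments per mile, independent of AP
        "hard": 1 / 40_000}
P_AP_ON = {"easy": 0.7,           # AP engaged far more often on easy miles
           "hard": 0.1}

def deployment_rate(ap_on: bool) -> float:
    """Average deployments per mile among miles driven with AP in the given state."""
    share = {"easy": P_EASY, "hard": 1 - P_EASY}
    weight = {k: share[k] * (P_AP_ON[k] if ap_on else 1 - P_AP_ON[k]) for k in share}
    total = sum(weight.values())
    return sum(weight[k] * RISK[k] for k in weight) / total

on, off = deployment_rate(True), deployment_rate(False)
print(f"AP on : {on:.2e} per mile")
print(f"AP off: {off:.2e} per mile")
print(f"AP-on miles look {off / on:.1f}x safer, with zero causal benefit assumed")
```

With these invented rates, AP-on miles come out more than twice as "safe" purely because of where AP gets used, which is exactly the confound the quoted comparison cannot rule out.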
 
When you are sampling data, you need controls. When the entire population (Tesla drivers) is reporting data, you don't need controls. It would be interesting to know, since they have ALL the data, whether drivers in various regions prefer to engage or not engage Autopilot. But in the meantime, they are perfectly able to say that Teslas on Autopilot are 50% less likely to be in accidents than Teslas not on Autopilot.