NTSB Says ‘Operational Limits’ of Autopilot Played Part in Deadly Crash
The National Transportation Safety Board said on Tuesday that “operational limitations” of Tesla’s Autopilot system played a “major role” in a 2016 crash.

One person was killed when a 2015 Tesla Model S collided with a semi-truck on U.S. Highway 27A. The Tesla’s Traffic-Aware Cruise Control and Autosteer lane-keeping assistance features were being used by the driver at the time of the crash. The Tesla was traveling at 74 mph just prior to impact.

The crash was the first known fatal crash involving a car using an automated driver assistance system, NTSB chairman Robert Sumwalt said.

NTSB investigators found that the Autopilot system operated as designed, but that it should have demanded more attention from the driver. Further, they said measuring the driver’s touching of the steering wheel “was a poor surrogate for monitored driving engagement.”

The NTSB released a preliminary report on the crash in May and, in June, said the driver kept his hands off the wheel for extended periods of time despite repeated automated warnings not to do so. Further, the NTSB said the driver’s hands were on the wheel for just 25 seconds during a 37-minute period in which Autopilot was engaged.

“Today’s automation systems augment, rather than replace, human drivers. Drivers must always be prepared to take the wheel or apply the brakes,” Sumwalt said.

Ahead of NTSB’s report, the family of Joshua Brown issued a statement that said the car wasn’t to blame:

“We heard numerous times that the car killed our son. That is simply not the case. There was a small window of time when neither Joshua nor the Tesla features noticed the truck making the left-hand turn in front of the car. People die every day in car accidents. Many of those are caused by lack of attention or inability to see the danger. Joshua believed, and our family continues to believe, that the new technology going into cars and the move to autonomous driving has already saved many lives. Change always comes with risks, and zero tolerance for deaths would totally stop innovation and improvements.”

 

 
I've said it once, and I'll say it again: Tesla should do so much more with the current UI/UX to try and alleviate the driver attentiveness issues.
They have two (2!) large screens, plus audio and tactile feedback in the wheel, available to them. And that's before they even start doing something with the driver-facing camera in the Model 3.

At least flash the whole screen, if not both screens, when you want the driver's attention. Use bright colors, loud sounds. I know it would annoy the designers, and some users, but this is about safety and "best effort".
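
To make the idea concrete, here's a minimal sketch of the kind of escalation table I'm imagining. Every threshold, timing, and action below is invented by me for illustration; none of this is Tesla's actual alert logic:

# A minimal sketch of an escalating attention-alert scheme.
# All thresholds and actions are hypothetical, not Tesla's firmware.

# (seconds without detected hands, alert action), in increasing severity
ESCALATION = [
    (10, "flash a message strip on the instrument cluster"),
    (20, "flash both screens and play a soft chime"),
    (30, "bright colors, loud repeated tone on both screens"),
    (45, "disengage Autopilot and slow the car gradually"),
]

def pick_alert(seconds_hands_off: float) -> str:
    """Return the most severe alert whose threshold has been crossed."""
    action = "no alert"
    for threshold, alert in ESCALATION:
        if seconds_hands_off >= threshold:
            action = alert
    return action

for t in (5, 12, 25, 50):
    print(f"{t:>2}s hands-off -> {pick_alert(t)}")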
 
This blog post has a pretty biased selection out of the NTSB report. It makes it seem that the NTSB purely blames the driver.

However, per one of the first paragraphs of the NTSB press release:

“System safeguards, that should have prevented the Tesla’s driver from using the car’s automation system on certain roadways, were lacking and the combined effects of human error and the lack of sufficient system safeguards resulted in a fatal collision that should not have happened,” said Sumwalt.
 
One can make the case that the NTSB is engaging in "nanny-state-ism" in suggesting that Tesla block Autopilot from working on any roads other than what it is optimized for (interstates). Many Tesla drivers have found the system to work fine on other roads (not all of them), and know how to keep an eye on the system and resume control when the car drives into an area the system can't handle. I, for one, am not in favor of restrictions on the system, especially based on a single incident.
 
This blog post has a pretty biased selection out of the NTSB report. It makes it seem that the NTSB purely blames the driver.

However, per one of the first paragraphs of the NTSB press release:

“System safeguards, that should have prevented the Tesla’s driver from using the car’s automation system on certain roadways, were lacking and the combined effects of human error and the lack of sufficient system safeguards resulted in a fatal collision that should not have happened,” said Sumwalt.
I'm not sure if you are reading the same post, but the blog does mention "NTSB investigators found that the Autopilot system operated as designed, but that it should have demanded more attention from the driver. Further, they said measuring the driver’s touching of the steering wheel “was a poor surrogate for monitored driving engagement.”"

I didn't get the impression that NTSB purely blames the driver.
 
NTSB's conclusion on how much time he was holding the wheel cannot be relied upon.

In my experience the system frequently bitches at you to hold the wheel when you already are holding it. You basically have to wiggle the wheel to 50% of the limit where Autopilot kicks off, and you have to do that every 10 seconds.

It probably didn't require this as often before the accident, but regardless, it constantly false-detects that your hands aren't on the wheel when they actually are. NTSB's conclusion about 25 seconds is not credible. There's really no way to know if he was holding it or not.
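
For what it's worth, here's roughly how I picture the torque-based detection working, and why a relaxed grip reads as "hands off". All the numbers are made up by me; I have no idea what Tesla's real thresholds are:

# Sketch of torque-based hand detection. Thresholds are invented for
# illustration; Tesla's real values are unknown to me.

TORQUE_DETECT_NM = 1.0     # assumed: torque needed to register "hands on"
TORQUE_DISENGAGE_NM = 2.0  # assumed: torque at which Autopilot kicks off

def hands_detected(applied_torque_nm: float) -> bool:
    """A pure torque sensor can't tell a light grip from no grip at all."""
    return applied_torque_nm >= TORQUE_DETECT_NM

# A relaxed driver resting hands on the wheel applies almost no torque,
# so the log records "hands off" and the nag fires...
print(hands_detected(0.2))   # False
# ...which is why you have to wiggle hard enough to register, but not so
# hard that you cross the disengage threshold.
print(hands_detected(1.5))   # True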
 
NTSB's conclusion on how much time he was holding the wheel cannot be relied upon.

In my experience the system frequently bitches at you to hold the wheel when you already are holding it. You basically have to wiggle the wheel to 50% of the limit where Autopilot kicks off, and you have to do that every 10 seconds.

It probably didn't require this as often before the accident, but regardless, it constantly false-detects that your hands aren't on the wheel when they actually are. NTSB's conclusion about 25 seconds is not credible. There's really no way to know if he was holding it or not.
Regardless of time spent holding the wheel, a driver needs to have his focus on the road; this guy didn't.
 
NTSB's conclusion on how much time he was holding the wheel cannot be relied upon.

In my experience the system frequently bitches at you to hold the wheel when you already are holding it. You basically have to wiggle the wheel to 50% of the limit where Autopilot kicks off, and you have to do that every 10 seconds.

It probably didn't require this as often before the accident, but regardless, it constantly false-detects that your hands aren't on the wheel when they actually are. NTSB's conclusion about 25 seconds is not credible. There's really no way to know if he was holding it or not.

By my read, being able to accurately test whether the driver is touching the wheel and frequently enforcing "touching the wheel" won't satisfy the NTSB. Their concern is that "touching the wheel" or even "holding the wheel" isn't a good proxy for what is actually important: paying attention to the road and not being distracted. They worry (correctly, I think) that a test based on touching/holding the wheel just encourages drivers to bat at the wheel in reaction to a signal from the car, but doesn't ensure that the driver is actually paying any attention to the road.
 
By my read, being able to accurately test whether the driver is touching the wheel and frequently enforcing "touching the wheel" won't satisfy the NTSB. Their concern is that "touching the wheel" or even "holding the wheel" isn't a good proxy for what is actually important: paying attention to the road and not being distracted. They worry (correctly, I think) that a test based on touching/holding the wheel just encourages drivers to bat at the wheel in reaction to a signal from the car, but doesn't ensure that the driver is actually paying any attention to the road.

Totally agree. Unless the car can monitor where your eyes are looking, there's no way to tell that you're paying any attention whatsoever.
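
If they ever do use the driver-facing camera for this, a crude version of the check might look like the sketch below. The angles and thresholds are entirely hypothetical; I'm just illustrating gaze-based monitoring as opposed to wheel-torque monitoring:

# Hypothetical gaze-based attention check; all angles and thresholds
# are invented for illustration.
from dataclasses import dataclass

@dataclass
class GazeEstimate:
    yaw_deg: float    # left/right gaze angle relative to the road ahead
    pitch_deg: float  # up/down gaze angle
    eyes_open: bool

def looking_at_road(gaze: GazeEstimate,
                    max_yaw: float = 25.0,
                    max_pitch: float = 15.0) -> bool:
    """Eyes open and pointed roughly at the road ahead."""
    return (gaze.eyes_open
            and abs(gaze.yaw_deg) <= max_yaw
            and abs(gaze.pitch_deg) <= max_pitch)

# A quick glance at the dash still passes; staring into your lap does not.
print(looking_at_road(GazeEstimate(yaw_deg=5.0, pitch_deg=-10.0, eyes_open=True)))  # True
print(looking_at_road(GazeEstimate(yaw_deg=0.0, pitch_deg=-60.0, eyes_open=True)))  # False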
 
Part of the problem was caused by the moniker Tesla chose to use; AP is a misnomer. I think something like "interactive driving" would have conveyed a better sense of the reality of the system.

I would bet that pilots have no problem understanding what it means. Pilots _always_ have to pay attention.

The problem comes from other people not understanding that the name was deliberately chosen to be analogous with autopilot on an airplane; the combination of the name's popular use and the way the feature is sold means that people easily end up misunderstanding it.
 
How do we know it's an understanding problem? I remember someone pointing out that Joshua Brown, in YouTube comments, was clearly explaining to commenters the importance of paying attention.

It's more that even when humans know they should do something, they are still prone to complacency or carelessness.
 
Pilots definitely momentarily take their eyes off the "road" while on autopilot. For example, they check instruments, they look at flight maps, they run checklists (especially when troubleshooting a problem). They're surely "mostly" paying attention to the sky ahead, but they do glance away and that's sort of the point of autopilot in a plane.

If Tesla's autopilot requires continuous, uninterrupted monitoring of the road ahead, then it is indeed misnamed. More like a fancy cruise control than autopilot on a plane. However, I don't think it does require that... a few seconds glancing down at the dash should be fine. Watching movies instead of driving... not so much.
 
Actually the moniker is totally apt. The public's understanding of it must be off the mark.

Yes, including mine before pilots on this forum educated me on the meaning of autopilot. So it seems to me that unless you're a pilot or very bright, unlike me and the general population, autopilot likely means to the masses that the car drives itself. And knowing that, should Tesla in good conscience have used that name? My answer is no -- and I would go further and say it should be changed to "driver assist" or something similar immediately.

The fact that AP can lull you into a false sense of security on its own is bad enough without Tesla adding to it. Tesla should be doing everything it can to tell customers AP requires constant attention but instead it makes videos of the car doing things that require no driver at all.

As to who is to blame for this accident, for me it's a contest between an over-zealous automaker and a too-trusting customer. One has my sympathy and the other does not, and sympathy goes a long way in court; despite the fact that justice is supposed to be blind to those sorts of things, it's only human nature to side with the victim. I think a good lawyer could easily convince a jury to attribute at least some fault to Tesla, which results in the damages being reduced by the percentage that the driver is contributorily negligent. The law does not require these cases to be all or nothing. It could be 50/50 following trial, and if damages are $2M his estate will get $1M. But I doubt we'll hear the results of a trial if there is a legal action. Tesla's insurer will likely negotiate a settlement with an NDA clause, if they haven't already, is my guess.

If you die driving into a tree with no AP and get $10.1M, then clearly at least some blame can be found just about anywhere these days.
 
I would bet that pilots have no problem understanding what it means. Pilots _always_ have to pay attention.

The problem comes from other people not understanding that the name was deliberately chosen to be analogous with autopilot on an airplane; the combination of the name's popular use and the way the feature is sold means that people easily end up misunderstanding it.

Indeed. Driver Assist would be a much better name for the masses than Autopilot.

On the water, licensed captains and/or their designated crew are trained to scan the horizon as frequently as every 7 seconds. Yes - seconds.

And yet, every year, at least one boat impacts something (often an island) at full cruising speed due to operator error. And that's with autopilot engaged.

On the water, and in the air, we know what AP is and is not. The general driving public does not. So Tesla calling what exists today "Autopilot" has not been helpful.

With specific regard to the Florida accident, anyone who's driven the type of road Mr. Brown was on knows how suboptimal having a truck turn in front of you can be. The trucker had been previously cited and did in fact turn left in front of oncoming traffic; Mr. Brown didn't react in time, and neither did the car. A tragedy any way you look at it.

I would have applauded the NTSB for recommending that Autopilot be renamed Driver Assist, but in the end, the whole point of autonomous driving is to not rely upon the driver, so the semantics end up being just that.

It's going to be a long slog from L2 to L5, folks. I look forward to the 2018 disengagement report, and hope that Tesla delivers upon its promises soon. As in actual soon - not Elon soon.

In the meantime, the truck announcement next month should be fun. For those who have seen the movie "Logan", those headless convoys are a bit unnerving, but no less so than overtaking a wobbly triple-trailer arrangement on a windy Utah highway. For anyone who has experienced Autosteer's truck lust at 80 mph, that puts a headless convoy firmly into truck-orgy territory, yep.

No, we won't have headless convoys anytime soon. But someday...
 