Welcome to Tesla Motors Club

Fatal autopilot crash, NHTSA investigating...

"She was passed and she says she was doing 85, and when this car just passed her, she was just like, wow, you know, I wonder how fast that car was going"

I think there's a distinction to be made between systems like TACC, which can only improve safety, and systems like AP, which may lull you into a false sense of security. If a system like TACC can only react in unpleasant ways, then you don't start relying on it. Beeping when you leave your lane, emergency braking, etc. are unpleasant enough that you will drive to avoid triggering them. You will still drive as you would without TACC.

On the other hand, a system like AP that is good enough to drive 99% of the time may affect your driving. It will often slow down smoothly when there is congestion ahead. It can completely control the car for hours at a time in good conditions. It is inevitable that you will pay less attention to the road. When people say AP is relaxing, is that not another way of saying "I wasn't as alert"?

For a system like TACC it's fair enough to say "the technology is in its infancy", "these things are hard", "Lidar is too expensive". Even basic TACC will sometimes save lives, so it's better than nothing, and at least it never makes things worse. For AP it's not so clear. By working 99% of the time it is going to affect your driving. Can you fix that by asking people to click through disclaimers? I'm not sure that's how psychology works.

It's not just that the driver didn't see the truck in time. He didn't see it at all (the brakes were not used). Trucks are big and slow, so that indicates he was very distracted at the time. Ask yourself: would the driver have driven at over 85 mph while being so distracted if he hadn't had AP? That seems unlikely. Perhaps human psychology means there's no middle ground for AP. Maybe you either have to be better than a human driver at all times or you shouldn't offer the feature. And when I say "at all times" I mean "assuming the driver will never be alert enough to take over at short notice".
 
Google initially built a system that relied on the driver taking control at short notice.

Urmson said Google believes that while technology that assists drivers can help reduce some accidents, only completely self-driving cars will fully address safety concerns. One reason, he said, is that the better the assistive technology gets, the more risks human drivers will take.

Nor, he said, will today’s driver-assist technologies just evolve to become fully self-driving cars on their own.

“That’s like me saying if I work really hard at jumping, one day I will be able to fly,” he said.

Google initially had a model with a steering wheel to allow humans to take over if need be, but its current design — unveiled at last year’s Code Conference — is fully self-driving.
Google Self-Driving Car Chief Wants Tech on the Market Within Five Years

[Image: Google self-driving car map]


 
  • Like
Reactions: mmd
It's not just that the driver didn't see the truck in time. He didn't see it at all (the brakes were not used). Trucks are big and slow, so that indicates he was very distracted at the time. Ask yourself: would the driver have driven at over 85 mph while being so distracted if he hadn't had AP? That seems unlikely. Perhaps human psychology means there's no middle ground for AP. Maybe you either have to be better than a human driver at all times or you shouldn't offer the feature. And when I say "at all times" I mean "assuming the driver will never be alert enough to take over at short notice".

I agree with your entire post except your conclusion.

Suppose the existing Autopilot system results in a traffic fatality rate of 1 in 150 million miles whereas without it the rate is 1 in 90 million miles. I say you use it because it is still an improvement over what we have. It's not perfect but all things considered it is better than without. People need to take personal responsibility for their actions. Barreling down the highway in 5,000 lbs. of steel at 75 MPH while texting a narrative to your BFF is just stupid - with or without Autopilot.

Your post was excellent - well thought out and articulated. I have found that everything you stated is true as it relates to my personal experience with over 20k miles using Autopilot so far. I will think about this incident next time I am using Autopilot and hopefully make the conscious decision to pay closer attention to my surroundings when using it.

Mike
 
  • Like
Reactions: Zythryn
I agree with your entire post except your conclusion.

Suppose the existing Autopilot system results in a traffic fatality rate of 1 in 150 million miles whereas without it the rate is 1 in 90 million miles.

That's attractive on the face of it, but you really have to be careful what you are comparing with. The relevant comparison is not to all miles driven, because the Tesla is a very safe car when it actually crashes. So to know if AP is helping in a statistical sense you should really be comparing with non-AP Tesla miles. Or at least cars with a comparable score on crash tests (not many of those...). At the very least you should be comparing with other TACC-equipped cars to see whether AP itself is improving safety.

But also there are a bunch of roads and situations that are so bad that nobody would even switch on AP. It's hard to know how to factor this in - there have been a lot of discussions in this thread on that point - but it seems likely that it is boosting the AP numbers.
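To make the baseline problem concrete, here is a toy calculation in Python. Every number below is hypothetical, chosen only to illustrate the point: the same AP fatality figure can look better or worse depending on which pool of miles you compare it against.

```python
# Toy illustration (all numbers hypothetical) of why the choice of
# baseline matters when judging Autopilot's fatality rate.

def fatality_rate(deaths, miles):
    """Deaths per 100 million miles driven."""
    return deaths / miles * 100e6

# Hypothetical Autopilot record: 1 fatality in 130 million miles.
ap = fatality_rate(1, 130e6)

# Baseline 1: all US vehicles, all roads (1 per 94 million miles).
all_vehicles = fatality_rate(1, 94e6)

# Baseline 2 (made up): modern, crash-safe cars restricted to the
# benign highway miles where drivers actually engage Autopilot.
comparable = fatality_rate(1, 200e6)

print(f"Autopilot:        {ap:.2f} deaths / 100M miles")
print(f"All vehicles:     {all_vehicles:.2f} deaths / 100M miles")
print(f"Comparable miles: {comparable:.2f} deaths / 100M miles")
# Against the broad baseline AP looks safer; against the restricted
# baseline it looks worse -- same AP number, different conclusion.
```

The same AP rate sits below one baseline and above the other, which is exactly the selection problem described above.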
 
  • Like
Reactions: mmd and bhzmark
I found it interesting today when a semi-truck was finishing a left turn in front of the cars stopped at a red light as the light turned green. Rather than waiting for the truck to complete the turn and clear the road, the vehicle in front of me drove straight toward the center of the truck and stopped next to it while the turn was still in progress. It was a reminder of how impatient drivers can be, even when they are stopped at a red light.
 
  • Helpful
  • Like
Reactions: mmd and SW2Fiddler
Wow, that newscast is typical media junk: "self-driving derailed," with claims that Autopilot kept driving. That is a load of crap. Having driven many, many miles on Autopilot, I know that if the sensors aren't reading, it deactivates. The car was going very fast and had high inertia. The newscast is claiming Autopilot took itself between two trees and couldn't avoid the pole. Combine media hype and drama with people who don't know what they are talking about and you get this.

So far it seems both drivers played a role: Josh unfortunately seems to have been speeding and possibly distracted, and the truck driver was playing loose with his driving. I've seen trucks and buses regularly turn across oncoming traffic, expecting vehicles to slow down for them.

I can't say this is exactly what happened, but so far it looks like driver error. Now there's the question of complacency and taking risks because the computer is handling things. Autopilot is still safer, even accounting for complacent drivers. It would be entirely crazy to get rid of Autopilot, AEB, and other similar systems.
 
That's attractive on the face of it, but you really have to be careful what you are comparing with. The relevant comparison is not to all miles driven, because the Tesla is a very safe car when it actually crashes. So to know if AP is helping in a statistical sense you should really be comparing with non-AP Tesla miles. Or at least cars with a comparable score on crash tests (not many of those...). At the very least you should be comparing with other TACC-equipped cars to see whether AP itself is improving safety.

But also there are a bunch of roads and situations that are so bad that nobody would even switch on AP. It's hard to know how to factor this in - there have been a lot of discussions in this thread on that point - but it seems likely that it is boosting the AP numbers.

Yep - I'm not claiming to know what the numbers are - but Elon claims it's safer with than without and I'm inclined to believe him barring any evidence to the contrary (I assume he sees the numbers).

Distracted driving is a huge problem right now - in cars without any driver assistance features.

Mike
 
  • Like
Reactions: Magus and Haddock
Wow, that newscast is typical media junk: "self-driving derailed," with claims that Autopilot kept driving. That is a load of crap. Having driven many, many miles on Autopilot, I know that if the sensors aren't reading, it deactivates. The car was going very fast and had high inertia. The newscast is claiming Autopilot took itself between two trees and couldn't avoid the pole. Combine media hype and drama with people who don't know what they are talking about and you get this.

So far it seems both drivers played a role: Josh unfortunately seems to have been speeding and possibly distracted, and the truck driver was playing loose with his driving. I've seen trucks and buses regularly turn across oncoming traffic, expecting vehicles to slow down for them.

I can't say this is exactly what happened, but so far it looks like driver error. Now there's the question of complacency and taking risks because the computer is handling things. Autopilot is still safer, even accounting for complacent drivers. It would be entirely crazy to get rid of Autopilot, AEB, and other similar systems.

That was complete nonsense. They just made stuff up. And then they closed with the obligatory ask-the-guy-on-the-street question about this newfangled technology, and of course he expresses scepticism. Total nonsense.

But who below the age of 70 watches local news anymore?

The Washington Post story was so much better than local TV or the other fake journalists who quote and repackage fear and ignorance as news.
 
Condolences to the families involved.

That being said, frankly, this guy was driving distracted in the first video showing AP avoiding the truck coming from his left. The truck was well forward of him, and he should have easily seen it coming over into his lane before AP did. He was apparently too busy listening to an audiobook, which, if my research is correct, was "What the Dog Saw" by Malcolm Gladwell, specifically an article titled "Something Borrowed", one of his columns from the New Yorker dated 11/22/2004. The point being: he was distracted then and AP saved him, and there is evidence that he was driving too fast and may have been distracted during the fatal crash, so perhaps this was common for this guy and he got too comfortable with letting the car drive. That is a clear violation of how it's intended to be used, but perhaps a warning or reminder for all of us.
 
  • Like
Reactions: mmd
Read up on how the Mobileye system works and how radar-guided ACC works in general. Your statement shows a fundamental ignorance of how these systems work. Long story short, such software needs to ignore certain things the sensor sees because of false positives (objects, parked cars, walls on the side of the road, tunnels, hills, etc.). Otherwise it would be constantly slamming on the brakes on every road that is not relatively flat and straight!

Exclusive: The Tesla AutoPilot - An In-Depth Look At The Technology Behind the Engineering Marvel
AutoSpeed - Technology of Adaptive Cruise Control
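For readers who don't want to wade through the articles, the core idea can be sketched in a few lines of Python. The function name and threshold here are my own illustration, not Mobileye's or any vendor's actual logic: a radar return that has never been observed moving is treated as roadside clutter and ignored.

```python
# Minimal sketch of why radar ACC discounts stationary returns.
# The 0.5 m/s threshold and this structure are illustrative only,
# not any real vendor's implementation.

def is_braking_target(target_speed_mps, was_tracked_moving):
    """Decide whether a radar return should trigger braking.

    A return that has never been seen moving is treated as roadside
    clutter (signs, bridges, parked cars) and ignored; otherwise the
    system would brake constantly on any road that is not flat
    and straight.
    """
    stationary = abs(target_speed_mps) < 0.5  # near-zero absolute speed
    if stationary and not was_tracked_moving:
        return False  # likely clutter: ignore
    return True       # moving, or a car that was tracked and then stopped

# A car ahead that slowed to a stop was previously tracked moving: brake.
print(is_braking_target(0.0, was_tracked_moving=True))   # True
# An overhead sign was never moving: filtered out.
print(is_braking_target(0.0, was_tracked_moving=False))  # False
```

The trade-off this sketch shows is exactly the one in the crash scenario: a crossing truck that the radar never tracked as a moving, in-lane target can end up on the "ignore" side of the filter.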
The video makes me question the entire investigation process of the fatal accident. A reporter walking around the scene a month and a half after the accident finds and picks up a piece of the car (front headlight)? That just sounds sloppy to me.
 
"She was passed and she says she was doing 85, and when this car just passed her, she was just like, wow, you know, I wonder how fast that car was going"

I think there's a distinction to be made between systems like TACC, which can only improve safety, and systems like AP, which may lull you into a false sense of security. If a system like TACC can only react in unpleasant ways, then you don't start relying on it. Beeping when you leave your lane, emergency braking, etc. are unpleasant enough that you will drive to avoid triggering them. You will still drive as you would without TACC.

On the other hand, a system like AP that is good enough to drive 99% of the time may affect your driving. It will often slow down smoothly when there is congestion ahead. It can completely control the car for hours at a time in good conditions. It is inevitable that you will pay less attention to the road. When people say AP is relaxing, is that not another way of saying "I wasn't as alert"?

For a system like TACC it's fair enough to say "the technology is in its infancy", "these things are hard", "Lidar is too expensive". Even basic TACC will sometimes save lives, so it's better than nothing, and at least it never makes things worse. For AP it's not so clear. By working 99% of the time it is going to affect your driving. Can you fix that by asking people to click through disclaimers? I'm not sure that's how psychology works.

It's not just that the driver didn't see the truck in time. He didn't see it at all (the brakes were not used). Trucks are big and slow, so that indicates he was very distracted at the time. Ask yourself: would the driver have driven at over 85 mph while being so distracted if he hadn't had AP? That seems unlikely. Perhaps human psychology means there's no middle ground for AP. Maybe you either have to be better than a human driver at all times or you shouldn't offer the feature. And when I say "at all times" I mean "assuming the driver will never be alert enough to take over at short notice".

Very sad, and I agree that AP lures you into a false sense of security. My wife told me a story from one of her psychology lectures: when the airbag first came out, it was touted as a great life-saving technology (which it is), but her professor claimed that instead of installing a life-saving airbag that cushions the blow, a poison dart should be deployed in case of a crash. Her theory is that humans will inevitably take more risk if they believe there is something else to offset the added risk they are taking. It sounds like that is the case here.

Also, does anyone remember the "auto retract" shoulder belts from the 80s? They were also removed because a lot of people ended up wearing no lap belt.
 
  • Like
  • Disagree
Reactions: NOLA_Mike and Magus
That's a ridiculous argument. The sentiment of putting poison darts in airbags is very cruel. Automobile deaths have really come down over the years, thanks in part to airbags. There are still a lot of deaths from driving too fast, or from a variety of other factors. A lot of cars have more than basic cruise control. The driver himself talked about the limitations of AP. I regularly see people on their cellphones - without AP. People are going to engage in unsafe driving practices regardless of airbags or self-driving features.
 

At 2:00 minutes forward he says "There's no way I'd ever trust a vehicle to drive me or my family."

Yet he doesn't look Amish to me. If he does actually trust a vehicle to drive him and his family, then someone needs to tell him that you can't find a safer one than a Tesla to do it in. I bet the one that he's trusting now to drive his family around has a lower safety rating than a Tesla.

But we can't expect reporters to get to the true facts. That's too much work.
 
I haven't read the complete thread, so I'm sorry if this has been brought up before, but I'm quite annoyed by this part of Tesla's statement:

This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles.

I mean, how are these figures even remotely comparable? Are they trying to tell us that AP controlling a very safe car under very benign driving conditions is safer than a human driver who is, e.g., driving on a twisting country road in pouring rain, and even safer than, e.g., a human driver who drives an ancient car on a washed-out road high up in the Andes?
Only miles driven by human drivers in equally benign conditions, the kind that allow AP to operate, should be used as a standard to measure the safety of a human driver vs. AP.
 
I mean, how are these figures even remotely comparable? Are they trying to tell us that AP controlling a very safe car under very benign driving conditions is safer than a human driver who is, e.g., driving on a twisting country road in pouring rain, and even safer than, e.g., a human driver who drives an ancient car on a washed-out road high in the Andes?

Good point. I never looked at it that way.
 
Tesla's post really could have been written better. But you also have to account for how extreme this accident was. Statistics need larger samples and more statistical power to be meaningful.

For Christ's sake, he seems to have gone at high speed through the undercarriage of a truck whose driver seems to have made a poor driving choice. See the prior posts about truck side rails. This was a relatively extreme accident in which the top of the car was obliterated driving through the undercarriage of an 18-wheeler.

You simply can't compare statistics with an n of 1, and this crash likely would have happened with or without Autopilot, whether in an E-Class or a Prius with AEB. Someone may have already mentioned a statistic on deaths involving the underside of an 18-wheeler.
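The n-of-1 point can be made quantitative. With one observed fatality in roughly 130 million miles (the figure from Tesla's statement quoted above), an exact Poisson confidence interval for the underlying rate is enormously wide. Here is a stdlib-only Python sketch; the interval math is the standard Garwood construction, not anything Tesla published:

```python
import math

# Exact (Garwood) 95% CI for a Poisson count, via the chi-square /
# Poisson relationship. Pure stdlib: invert the chi-square CDF by bisection.

def chi2_cdf_even_df(x, df):
    """Chi-square CDF for even df (df = 2m), using the identity
    P(chi2_2m <= x) = P(Poisson(x/2) >= m)."""
    m = df // 2
    lam = x / 2.0
    tail = sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(m))
    return 1.0 - tail  # 1 - P(Poisson(lam) <= m-1)

def chi2_ppf_even_df(p, df, hi=1e3):
    """Invert the CDF above by bisection on [0, hi]."""
    lo = 0.0
    for _ in range(100):
        mid = (lo + hi) / 2.0
        if chi2_cdf_even_df(mid, df) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

def poisson_ci(k, alpha=0.05):
    """Exact two-sided CI for the mean of a Poisson count k."""
    lower = 0.0 if k == 0 else chi2_ppf_even_df(alpha / 2, 2 * k) / 2.0
    upper = chi2_ppf_even_df(1 - alpha / 2, 2 * (k + 1)) / 2.0
    return lower, upper

miles = 130e6
lo, hi = poisson_ci(1)  # one observed fatality
per100m = lambda events: events / miles * 100e6
print(f"observed: {per100m(1):.2f} per 100M miles")
print(f"95% CI:   {per100m(lo):.3f} to {per100m(hi):.2f} per 100M miles")
# The interval spans roughly 0.02 to 4.3 deaths per 100M miles: far too
# wide to conclude anything against a baseline near 1 per 100M miles.
```

With a single event, the plausible underlying rate ranges from far safer than the national average to far more dangerous, which is exactly why one crash settles nothing either way.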
 
  • Like
Reactions: DrManhattan
This is a situation where a brain-computer interface is needed. The computer recognizes an object (higher off the ground) and rejects it. The brain itself doesn't see the object (whether he was sleeping, watching a movie, texting, whatever). But if the computer sends the signal to the brain, the human brain could see and recognize, via cognition, that it was a truck and not an overhead sign. It could happen in nanoseconds. The computer should "wait" for the human brain interface to act on the outcome, or the brain itself (via muscles) could clamp on the brake. Future APs relying only on the optical camera and a computer interface are likely to fail again and again and again. The US government and NIH are going to spend upwards of 30 billion dollars on brain research; this could be an interesting project for Tesla and other AP makers.
 
If anything, this whole tragedy demonstrates the dangers of high-speed driving, distractibility, bad driving behaviors, and 18-wheeler trucks. Remember the video he took where AP avoided an accident? The vehicle, if I remember correctly, was a delivery truck of sorts - not an 18-wheeler, but still a large vehicle that maneuvered dangerously.
 

At 2:00 minutes forward he says "There's no way I'd ever trust a vehicle to drive me or my family."

Yet he doesn't look Amish to me. If he does actually trust a vehicle to drive him and his family, then someone needs to tell him that you can't find a safer one than a Tesla to do it in. I bet the one that he's trusting now to drive his family around has a lower safety rating than a Tesla.

But we can't expect reporters to get to the true facts. That's too much work.

Machines can be made safer than humans under most conditions. When something very unexpected happens, a human may or may not behave better than a machine. When an A320 ingested a bunch of birds after taking off in New York a few years ago, a quick-thinking pilot managed to ditch the plane on the Hudson River and saved everyone.

On the other hand, a green pilot stalled an Airbus over the Atlantic when some sensors iced up and he did the wrong thing after the autopilot told him to take control. We only know what happened because they found the black boxes. Another time, a 777 landing at San Francisco hit the seawall at the end of the runway because a green pilot came in too low.

People don't trust machines to protect them, but machines are watching all sorts of things that we take for granted and make sure we're safe. Ultimately self driving cars will be safer than human control, though even 50 years from now there will be the occasional death from something freaky happening and the computer getting confused.

Getting from where we are today to fully autonomous and safe vehicles is a big problem all in itself. Running a highway in which all cars are computer controlled is easier than dealing with a highway where some cars are computer controlled and some are not. That muddy area where we have a mix is the most difficult problem to solve.

It also doesn't help that people don't usually trust new machines when it comes to safety. Most people would freak out if they really knew how little modern airline pilots do. They are paid big bucks to sit and watch a machine fly itself and wait for something to go wrong, which rarely happens.