Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Another Model X crash, driver says autopilot was engaged

I did see another theory on Reddit that the fellow could have been testing Autopilot to see what it would do if you ignored the warnings. The car started slowing down, he gave it gas and torqued the steering wheel, then let go again thinking AP was still on.

I guess I have a hard time imagining he was so asleep that neither he nor his passenger noticed the warnings.
 
I can't have an accident caused by thinking that my car's autosteering is active, because it doesn't have autosteering.

So because a few people have either exhibited poor decision making or have not thoroughly in-serviced themselves, we should not have the tech?

I've probably got a couple of thousand miles of AP under my belt. It's not always perfect, but I'm there to monitor and take over. My hands might not be resting on the wheel, but they're near enough to grab it quickly. I take full responsibility for my car's actions, as I believe is required in the TOS before enabling AP. I'm failing to see the point. It feels like Audi and the unintended-acceleration scare: they got creamed in the press because they didn't fight it. People were mistaking the brake for the gas. It can happen. People make mistakes, and can suffer from poor judgement.
 
I really cannot believe that someone so intelligent as Elon Musk is entirely missing the point here. Whether AP was actually engaged at the point of impact is irrelevant. This is still an Autopilot-related crash.

You are arguing that the crash is related to a feature that was not turned on? By your definition, aren't all Tesla crashes Autopilot related?
 
I really cannot believe that someone so intelligent as Elon Musk is entirely missing the point here. Whether AP was actually engaged at the point of impact is irrelevant. This is still an Autopilot-related crash.

I can't have an accident caused by thinking that my car's autosteering is active, because it doesn't have autosteering.

Yeah, that's completely ridiculous. I guess all of the mistaken gas for brake accidents aren't the fault of the driver, but the fact that manufacturers allow people to get confused by the two...
 
You are arguing that the crash is related to a feature that was not turned on? By your definition, aren't all Tesla crashes Autopilot related?

I guess this brings up the question of where the cutoff is. So in this case, AP was disabled 11 seconds before the crash. In the van case, AP was probably disabled a second or two before the crash when he slammed on the brakes.

Based on Tesla's current definitions, they seem to consider any crash where AP was disabled at the point of impact to not be AP related. I don't know if I would agree with that assessment, but I see why it's in their best interest to take that line.

I was always curious as to when the I-5 rear-ending driver had disabled AP (and AEB) with her brakes before the crash. We never got that timeline, as far as I'm aware.
 
I really cannot believe that someone so intelligent as Elon Musk is entirely missing the point here. Whether AP was actually engaged at the point of impact is irrelevant. This is still an Autopilot-related crash.

I can't have an accident caused by thinking that my car's autosteering is active, because it doesn't have autosteering.

Sure you can. It happened to a Volvo driver that relied upon a system that their car was not equipped with.
 
I guess this brings up the question of where the cutoff is. So in this case, AP was disabled 11 seconds before the crash. In the van case, AP was probably disabled a second or two before the crash when he slammed on the brakes.

I respectfully disagree that there is any question. With L2, the answer is always: "it's the driver's responsibility." In both cases you cite (really, in all three cases we know about), if the driver had been paying attention to the road, there would have been no accident.

The only time I think you can legitimately blame AP is if there is an actual bug in the AP HW or SW, where it does not do something that it should.
 
I guess this brings up the question of where the cutoff is. So in this case, AP was disabled 11 seconds before the crash. In the van case, AP was probably disabled a second or two before the crash when he slammed on the brakes.

Based on Tesla's current definitions, they seem to consider anything where AP was disabled at the point of impact to not be AP related.

Did they say it wasn't AP-related, or did they say it wasn't on at the time of the crash?

Being "technically correct" is the "best correct"!!
We have always been at war with Eastasia.

My guess is the lawyers have been spending a lot of time on the wording of these things ;)
 
So maybe it was AP's fault. Maybe after the first visual and audio warnings are ignored it should continue only the visual warnings and slow the car down to a stop before waking the driver up. (Assuming that the driver had dozed off.) I can see how suddenly being woken up with the car slowing down in traffic while you are still groggy/dazed could cause someone to overreact.
And, perhaps some lullabies and a glass of warm milk in case the driver wakes up too early.
 
The only time I think you can legitimately blame AP is if there is an actual bug in the AP HW or SW, where it does not do something that it should.

Fair enough. Although it will be very difficult to prove a hardware/software failure, since the driver is supposed to always be aware and able to take action.

I also consider the human-factors effects of Autopilot to be part of Autopilot, so I do consider Autopilot-related distracted driving to be something that may need to be looked into.
 
So basically the Tesla was doing what some have asked it to do i.e. react to lack of driver input. It was slowing down and probably wouldn't have wrecked except the driver took over and then human error intervened.

Agreed. I think there is still one improvement which could be made, though, which I'll talk about below.

Firstly, I think that job number one is to prevent a driver from becoming distracted, or incapacitated, in the first place. Once a driver has entered this state, there are far fewer options and things can go south quickly. In that respect, the driver failed. Autopilot's beeps and prods are designed to help prevent this, so in that respect, Autopilot also failed.

That's not really fair to Autopilot, though, since no matter what it does, the driver could still fall asleep, or worse. With level 2 autonomy there is little that can be done, but still, there is something. I imagine this conversation between two Autopilot engineers:

A: "So, what do we do if the driver falls asleep?"

B: "We nag him awake."

A: "And if that fails?"

B: "We reduce power to the car."

A: "Great. But... you don't think he'd wake up in a panic, note that the car is slowing down, do something rash and roll the car, do you?"

B: "Well... maybe."

A: "But he deserved it."

B: "Yes."

A: "And what if someone is having a stroke?"

B: "Don't have a stroke."

To prevent this kind of situation, when disengaging, Autopilot should not allow driver input until the car has completely stopped. It's not like Autopilot was having trouble with the road. It could have continued driving, and thus would have been able to continue to stop safely. This prevents an incapacitated driver from immediately taking control of the car and doing something stupid when Autopilot decides to stop the car. After all, the driver is known to be incapacitated. They don't know exactly where they are, perhaps even what lane they are in, who is in their blind spot, how fast they are going, etc. And they may be groggy and panicked to boot. Why should they be allowed to resume driving immediately? They can do so after the car has stopped.
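A rough sketch of the escalation-plus-lockout behavior proposed above, as a toy state machine. Every name, threshold, and transition here is a hypothetical illustration of the idea, not Tesla's actual Autopilot logic:

```python
from enum import Enum, auto

class ApState(Enum):
    ENGAGED = auto()          # driver responsive, Autopilot steering
    NAGGING = auto()          # visual/audio warnings active
    CONTROLLED_STOP = auto()  # driver unresponsive: slow to a stop
    STOPPED = auto()          # vehicle halted, control returned

def next_state(state, driver_responded, seconds_since_warning, speed_mph):
    """Advance the hypothetical disengagement state machine one step.

    Returns (new_state, lock_steering). lock_steering is True only
    while the car is executing the forced stop, per the proposal that
    a driver who ignored all warnings should not be able to grab the
    wheel until the car is stationary.
    """
    if state is ApState.ENGAGED:
        if not driver_responded:
            return ApState.NAGGING, False
        return ApState.ENGAGED, False
    if state is ApState.NAGGING:
        if driver_responded:
            return ApState.ENGAGED, False
        if seconds_since_warning > 10:  # assumed warning window
            return ApState.CONTROLLED_STOP, True
        return ApState.NAGGING, False
    if state is ApState.CONTROLLED_STOP:
        if speed_mph <= 0:
            return ApState.STOPPED, False  # control returns once halted
        return ApState.CONTROLLED_STOP, True
    return ApState.STOPPED, False
```

The point of the sketch is only that the steering lockout exists solely on the path from "ignored warnings" to "stationary"; every other transition leaves manual control available.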
 
Agreed. I think there is still one improvement which could be made, though, which I'll talk about below.

[...]

To prevent this kind of situation, when disengaging, Autopilot should not allow driver input until the car has completely stopped. [...] Why should they be allowed to resume driving immediately? They can do so after the car has stopped.

That is a terrible idea.
 
OK, based on the new info in the Jalopnik article, it sounds like the driver fell asleep while on AP, woke up as the car was nagging him, then fell asleep again, and woke up when the car started crashing. He was probably "sleep driving" when he initially took control of the car (i.e. not fully awake), and probably won't remember that. That's my guess. Regardless, I don't think we should blame AP for this crash.
 
That is a terrible idea.

Could you elaborate? I assert that it might have prevented this accident. If the driver did not regain control, the car would have stopped, anyway (I presume). This tweak just ensures that the stop happens.

Do you feel that Autopilot should never stop the car when disengaging? Or perhaps it should just prevent manual steering for x seconds when it thinks the driver is waking up?
 
Could you elaborate? I assert that it might have prevented this accident. If the driver did not regain control, the car would have stopped, anyway (I presume). This tweak just ensures that the stop happens.

Do you feel that Autopilot should never stop the car when disengaging? Or perhaps it should just prevent manual steering for x seconds when it thinks the driver is waking up?

You will prevent this one class of accident and cause a whole raft of others. First, the car doesn't know whether the person is waking up from sleep or not; it cannot determine that. Second, locking out controls will open AP up to a whole host of disastrous situations: running over pedestrians, driving off cliffs, etc. Manual control should always be FULLY available.
 
You will prevent this one class of accident, and cause a whole raft of others. First, the car doesn't know if the person is waking up from sleep or not, it cannot determine that. Second, locking out controls will open AP up to a whole host of disastrous situations: running over a pedestrians, driving off cliffs, etc. Manual control should always be FULLY available.

I don't propose always locking out the driver; only when Autopilot is disengaging because the driver has not responded to alerts. In this case, the car was going to stop anyway, and thus, yes, it has to have the capability to stop safely. However, even if the stop were suboptimal, I don't think it is at all likely that an incapacitated driver will be able to take any course of action safer than the auto-stop. Thus, the forced auto-stop will result in the fewest accidents.

If Autopilot disengages because it can't handle the current conditions, then, of course, the driver should be allowed to take control.
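That refinement can be stated as a single guard: withhold manual steering only while the car is auto-stopping because of an unresponsive driver, never for any other kind of disengagement. The `DisengageReason` names and the guard itself are purely hypothetical, not anything Tesla has implemented:

```python
from enum import Enum, auto

class DisengageReason(Enum):
    DRIVER_UNRESPONSIVE = auto()  # driver ignored repeated alerts
    ROAD_CONDITIONS = auto()      # Autopilot can't handle the road
    DRIVER_OVERRIDE = auto()      # driver deliberately took control

def steering_locked(reason, vehicle_stopped):
    """Hypothetical guard: manual steering is withheld only while the
    car is auto-stopping because the driver never responded. In every
    other case, control is fully and immediately available."""
    return reason is DisengageReason.DRIVER_UNRESPONSIVE and not vehicle_stopped
```

Under this sketch, a disengagement caused by road conditions (or by the driver deliberately taking over) never locks anything, which is the distinction the post is drawing.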
 
As you know, AP includes a bunch of stuff. Off the top of my head, what's "beta" is Autosteer, Auto Lane Change, Autopark and maybe Summon. Not all of that is broken.
None of it is broken

Beta can't be broken? It's not baked.

"Hey! What's wrong with this cake?"
"It hasn't been in the oven yet."
"I know, but it's not the cake you promised."
"Didn't you hear me? I haven't baked it yet."
"I know, but I ate a lot of it and it made me sick."
"Are you listening to me?"
"Yes, I know, but..."
 