Welcome to Tesla Motors Club

WARNING: I rear-ended someone today while using Auto Pilot in my brand new P90D!

No. (See below.)

That's incorrect.

Here is what the manual has to say:

--
Automatic Emergency Braking does not apply the brakes, or stops applying the brakes, in situations where you are taking action to avoid a potential collision. For example:
• You turn the steering wheel sharply.
• You press the accelerator pedal.
• You press and release the brake pedal.
• A vehicle, motorcycle, bicycle, or pedestrian, is no longer detected ahead.
--

Note the "press and release the brake pedal." So simply pressing the brake pedal is evidently a different situation, and one that does not cause AEB to withhold the brakes or to stop applying them.

Thank you for pointing out relevant wording from the Manual.

I have to say, though, that I interpret it differently. The quote essentially says that AEB does not take any action (whether initiating or cancelling the process of automatic braking) in situations where the driver is taking action to avoid a potential collision. The list given in the Manual is introduced with "for example," which means it is not exhaustive; so the fact that applying the brake, as opposed to applying and releasing it, is not listed as an example does not mean it isn't considered an indication of the driver taking action to avoid a collision.

So to answer my own question, it appears that application of the brakes by the driver *before* AEB initiated would indeed prevent its initiation, but if the sequence of events was such that AEB initiated *first*, the driver applying the brake after that will not cause AEB to stop applying the brakes.
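That two-phase reading can be sketched as a tiny decision function. To be clear, this is purely an illustration of my interpretation, not Tesla's actual logic; the function name and inputs are made up for the example:

```python
# Purely illustrative sketch of the interpretation above -- NOT Tesla's
# actual AEB logic. The function name and inputs are invented here.

def aeb_brakes(aeb_already_initiated, brake_pressed_before,
               brake_pressed_and_released_after):
    """Would AEB apply (or keep applying) the brakes, per this reading?"""
    if not aeb_already_initiated:
        # Driver braking *before* AEB initiates counts as taking action
        # to avoid the collision, so AEB never starts.
        return not brake_pressed_before
    # Once AEB has initiated, only a press-AND-release cancels it;
    # pressing and *holding* the brake leaves AEB braking.
    return not brake_pressed_and_released_after

# Driver braked first -> AEB never initiates:
print(aeb_brakes(False, True, False))   # False
# AEB initiated first, driver presses and holds -> AEB keeps braking:
print(aeb_brakes(True, True, False))    # True
# AEB initiated first, driver presses and releases -> AEB cancels:
print(aeb_brakes(True, True, True))     # False
```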
 
I've been following this thread since day one and it is appalling that people continue to attack the OP.

Also, I hate to say it but even if there was an anomaly with TACC/AP, Tesla isn't going to come out and admit it.

You may consider filing a report with NHTSA if you truly believe that there was a fault.

Disclaimer before anyone attacks me: I have an S, am an investor in TSLA (2012), an enthusiast, and have an X reservation.
 
Who owns the data stream flowing back from OP's vehicle?

OP looks like a reasonable guy to me, still struggling to understand exactly what happened in a complicated human/machine interaction, and the service manager's response (per OP's report) seems quite unsatisfying. Forget legal responsibility, moral responsibility, reams of documented disclaimers -- I'll stipulate that OP has 100% responsibility.

However, as an (ex) engineer, current P85+ owner, and likely future owner of a P90DL+AP descendant (depending on what's available this coming fall), I'd like to know how a vehicle equipped with TACC rolls into a vehicle in front of it in a situation that sounds an awful lot like one where TACC should be able to come to a stop (and which per other reports routinely DOES). Which of the exceptions that Tesla Engineering already knows about applies to this situation, so that the system is still supposedly performing as designed? Locked onto the wrong car? Speed lower than some limit? Speed higher than some limit? System disengaged? I can build a system with documented exclusions for how it is to be operated -- OK, realistically, I can ask someone else to build it for me :) -- but then, when I'm looking at a behavior in this deployed product, I (or my engineering representative) should be able to explain how that behavior happened and why it accords with the system's design parameters.

The service manager claims the system is performing as designed. Fine. Then someone, somewhere at Tesla should be able to explain, in adequate detail, so that another engineer can follow the explanation and reach the same conclusion.

Alan
 
I'm glad this post will be buried far back in the thread, but to the OP, it seems that you're not alone. Here's a Dutch article about a Tesla that slammed into the back of a truck at 80kph while supposedly on Autopilot. The details are scant, and it's written in Dutch so as always, something is lost in the (Google) translation, but it's the first report of a serious accident while on autopilot that I've seen.

[Attached images: accident 1.JPG, accident 2.JPG]
 
Why didn't TACC stop the car -- as it otherwise does a million times a day in similar situations?

What was unique in this case?

When Tesla figures that out they can then further improve the software to better handle those situations (as they have already done with exits and curves and jersey walls etc.). Meanwhile, if we know what those circumstances are, we can be more alert and prepared to take over in those situations.

What was unique about this situation?
 
Going to have to echo green1's post and reference my other again.

First, the airbag is designed to do what an airbag does, which is deploy in the event of a collision. Did it do this in your example? Sounds like it, so it worked as designed. Now, since the airbag is a safety feature and is supposed to know when to deploy, if it deploys when it isn't supposed to, then, well, someone has some explaining to do.

Next, the door latch is supposed to hold the door closed. Plain and simple, no caveats around that. If on a new vehicle this fails, then I could see there being an issue. If on some old rust bucket it failed, well... I think you're going to have a tough time with that one.

Again, none of these situations is comparable to Autopilot, which nowhere claims to prevent an accident or to slow down the vehicle in all cases. It was never designed for this, nor has it been advertised to do this. At least that's one thing Tesla has actually advertised correctly. So while door latches, airbags, brakes, etc. all have a purpose with little to no caveats on their operation and no ambiguity about liability when they fail to perform, I think the matter with TACC/Autosteer is very different. I'm pretty sure someone would have a heck of a time taking Tesla to court over something like this.

I agree that establishing a defect in AP would be more difficult than establishing one in a door latch. I don't think it's impossible; off the top of my head, if the AP failed to disengage, eg, that would be a pretty clear defect.

My only point was that the question of liability for a defect (assuming you could show one) is separate from the question of the driver's negligence.
 
Why didn't TACC stop the car -- as it otherwise does a million times a day in similar situations?

What was unique in this case?

When Tesla figures that out they can then further improve the software to better handle those situations (as they have already done with exits and curves and jersey walls etc.). Meanwhile, if we know what those circumstances are, we can be more alert and prepared to take over in those situations.

What was unique about this situation?

See? This is what I mean. Up thread green1 jumped all over me for asking if the car stops itself, saying of course it doesn't. Yet this post seems to suggest that it does.

I don't have AP in my car, but I am curious about why there's such confusion about what seems like a pretty basic question.
 
Visibility good, but were you driving east into low-angle morning sun? That could confuse it.
I don't want to pry or be too weird about this, but at 8:30 am CT on 1/13 in Chicago the sun's altitude was 10 degrees and its azimuth 132. You were on I-90, the Kennedy Expressway, presumably driving into town with the rush, where you would have been heading 135 degrees (SE) most of the trip and the sun would have been almost exactly dead ahead. Chicago was reported as 'partly cloudy' at 8:30. If AP was looking into the sun, that might be the explanation.
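For what it's worth, the "almost exactly dead ahead" claim is easy to sanity-check: the offset between the quoted heading (135 degrees) and sun azimuth (132 degrees) works out to just 3 degrees. A minimal sketch, using only the figures quoted above (the helper name is my own):

```python
# Sanity check on the sun-glare theory, using the numbers quoted above.
# The helper function is invented for this example.

def relative_bearing(heading_deg, azimuth_deg):
    """Smallest unsigned angle between vehicle heading and sun azimuth, in degrees."""
    return abs((azimuth_deg - heading_deg + 180) % 360 - 180)

# Heading ~135 deg (SE on the Kennedy), sun azimuth 132 deg,
# altitude 10 deg at 8:30 am CT -- low and nearly dead ahead.
print(relative_bearing(135, 132))  # -> 3
```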
 
Thank you for pointing out relevant wording from the Manual.

I have to say, though, that I interpret it differently. The quote essentially says that AEB does not take any action (whether initiating or cancelling the process of automatic braking) in situations where the driver is taking action to avoid a potential collision. The list given in the Manual is introduced with "for example," which means it is not exhaustive; so the fact that applying the brake, as opposed to applying and releasing it, is not listed as an example does not mean it isn't considered an indication of the driver taking action to avoid a collision.

So to answer my own question, it appears that application of the brakes by the driver *before* AEB initiated would indeed prevent its initiation, but if the sequence of events was such that AEB initiated *first*, the driver applying the brake after that will not cause AEB to stop applying the brakes.

I can understand your interpretation. It could be the right one.

On the other hand, for the list of examples given, why would Tesla have included the "and release" with respect to the brake pedal, if just pressing it was enough?

I would like to think that the system is sophisticated enough that if it senses you are braking, but not braking enough, it will brake more, and that the "brake and release" exception is in place for a situation where a driver may have decided to brake but then, for some reason, determined braking isn't the best course of action and is trying something else.

I think either one of us could be correct. I expect the only way to find out the real answer will be to have Tesla tell us, but depending on who at Tesla answers the question, I don't know how much stock we'll be able to put into the answer.


Didn't realize the manual said it was a brake press and release that overrides AEB. Wonder if it is actually implemented that way in the current software? That could be easily verified...

I'm wondering how you think we could go about easily verifying this. Since AEB is only supposed to kick in when a frontal collision is unavoidable, I don't see this as something that we could easily test.
 
Going to have to echo green1's post and reference my other again.

First, the airbag is designed to do what an airbag does, which is deploy in the event of a collision. Did it do this in your example? Sounds like it, so it worked as designed. Now, since the airbag is a safety feature and is supposed to know when to deploy, if it deploys when it isn't supposed to, then, well, someone has some explaining to do.

Next, the door latch is supposed to hold the door closed. Plain and simple, no caveats around that. If on a new vehicle this fails, then I could see there being an issue. If on some old rust bucket it failed, well... I think you're going to have a tough time with that one.

Again, none of these situations is comparable to Autopilot, which nowhere claims to prevent an accident or to slow down the vehicle in all cases. It was never designed for this, nor has it been advertised to do this. At least that's one thing Tesla has actually advertised correctly. So while door latches, airbags, brakes, etc. all have a purpose with little to no caveats on their operation and no ambiguity about liability when they fail to perform, I think the matter with TACC/Autosteer is very different. I'm pretty sure someone would have a heck of a time taking Tesla to court over something like this.

Based on your work in the hacking thread, do you know what the logs Tesla pulls from the OP's car would show? Is there enough detail that they would know what happened? I'm curious whether they can make correlations from the various data inputs to actually help out here.
 
I don't want to pry or be too weird about this, but at 8:30 am CT on 1/13 in Chicago the sun's altitude was 10 degrees and its azimuth 132. You were on I-90, the Kennedy Expressway, presumably driving into town with the rush, where you would have been heading 135 degrees (SE) most of the trip and the sun would have been almost exactly dead ahead. Chicago was reported as 'partly cloudy' at 8:30. If AP was looking into the sun, that might be the explanation.

I would expect AEB to use the radar, not the camera. So no, I wouldn't expect the position of the sun to affect this. Constructive line of thinking, though.
 
I would expect AEB to use the radar, not the camera. So no, I wouldn't expect the position of the sun to affect this. Constructive line of thinking, though.

This wasn't an AEB failure. TACC was in use. TACC, if it was operating properly and if it was locked on the car that was rear-ended as the target car, should, theoretically, have stopped the OP's car.

AEB is only supposed to kick in when a frontal collision is unavoidable, and then only to reduce the impact of the collision, not to completely prevent it.
 
I can understand your interpretation. It could be the right one.

On the other hand, for the list of examples given, why would Tesla have included the "and release" with respect to the brake pedal, if just pressing it was enough?

I would like to think that the system is sophisticated enough that if it senses you are braking, but not braking enough, it will brake more, and that the "brake and release" exception is in place for a situation where a driver may have decided to brake but then, for some reason, determined braking isn't the best course of action and is trying something else.

I think either one of us could be correct. I expect the only way to find out the real answer will be to have Tesla tell us, but depending on who at Tesla answers the question, I don't know how much stock we'll be able to put into the answer.

I think that the intent of the write-up in the Manual was to indicate that AEB action (whatever it might be) is prevented in situations when the driver is taking action to avoid a potential collision. The sample list of such actions includes braking and *releasing* the brake because this could be interpreted as the driver *not* taking action to avoid a collision, while according to the Tesla AEB algorithm it *is* considered a sign of such action; to eliminate ambiguity, they listed such a "press and release" action explicitly. The driver pressing and holding the brake pedal, on the other hand, unambiguously means that he is taking action to avoid a collision, so it was not included in the list of examples.

- - - Updated - - -

This wasn't an AEB failure. TACC was in use. TACC, if it was operating properly and if it was locked on the car that was rear-ended as the target car, should, theoretically, have stopped the OP's car.

AEB is only supposed to kick in when a frontal collision is unavoidable, and then only to reduce the impact of the collision, not to completely prevent it.

I also think that AEB was not engaged during this incident. If the AEB system had been engaged, the car would have displayed the "Emergency Braking in Progress" warning on the dash, which the OP did not mention in his description of the incident.
 
This wasn't an AEB failure. TACC was in use. TACC, if it was operating properly and if it was locked on the car that was rear-ended as the target car, should, theoretically, have stopped the OP's car.

AEB is only supposed to kick in when a frontal collision is unavoidable, and then only to reduce the impact of the collision, not to completely prevent it.
Andy, do you know whether TACC is radar-only? If so, my sun-in-the-eye theory is out the window.
 
Andy, do you know whether TACC is radar-only? If so, my sun-in-the-eye theory is out the window.

I'm sorry, I don't.

I do know that there has been discussion that some of the "Driver Assistance Unavailable" messages that impact TACC and other features may at times be caused by condensation in front of the camera, but I don't know if that has ever been conclusively proven as a cause for the error, and even if it has been, that error message applies more broadly than to just TACC.