WARNING: I rear-ended someone today while using Auto Pilot in my brand new P90D!

My settings: 6, with 15,000 AP miles in 26 states from NY to LA. Does a car cut in front once in a while? Yes. If I get too upset I just hit the go pedal and drive one-pedal, but the truth is it sets me back all of a few car lengths a day. Who cares? Being overly concerned about people getting in front of me isn't worth the stress, or the insurance rates.

GMTA. 6 on highways, and 4 on any road under 50 mph for TACC. I don't care who gets in front of me. I know that I can (and often do) overtake them when necessary.

Kudos to the OP for sharing so much about his unfortunate event and the Tesla reply. I'm still glad that I didn't wait for AP 2.0 but this thread has helped confirm my resolve to cautiously engage AP features.
 
It's not acceptable to say, as Tesla apparently did, that TACC operated as designed as it drove into the car in front of you. They need to describe in what way this was some rare edge case that TACC couldn't deal with as we would expect it to -- namely, by stopping.

What were the special facts that led TACC to drive into the car in front of you, rather than stop behind the car, as it does for me dozens of times each day?
 
@OP - I can just imagine your frustration at trying to discuss the logs with Tesla.

Maybe you could ask to discuss this with an engineer at Tesla directly and perhaps get a more informed conversation.

It very much sounds as though AEB failed to operate for you, and as a safety assistance device it is entirely reasonable for you to request information as to why it failed.
I would really expect Tesla to be crawling over all the components of the system, performing tests on it, and potentially making component replacements (e.g. the radar) to ensure your car is operating correctly.
Assistance mode or not, it is not good PR for Tesla.

The lesson to us all (which a few here seem to find difficult to comprehend) is that Driver Assistance functionality is just that: assistance (driver responsible), not mandatory core (car responsible) functionality.

The sad part is that your incident shows that in all truth these assistance functions are really a long way from being dependable.
As such they are simply fun features that can indeed, on balance, improve safety and the driving experience, but they simply cannot be relied on in every circumstance.
In turn of course the less we can trust them, the less value they have.
Really we must keep in mind we are very much in first generation of this tech.
 
Gosh, such a callous response from the Tesla rep, but given the circumstances he was in full legal defense mode from the very beginning. I'd make a poker play and "raise him": ask for the logs so you have a copy of them. If he won't release them, tell him that your attorney will obtain them via subpoena! You can bet your sweet A$$ that either Elon or one of his executive team members will be giving you a call to provide you with more courteous details. :)

Sit back, relax, and grab the popcorn!

Dude, the OP owned the car for just a couple days. Do you really think it's more likely that the system failed, or that a new owner just had a simple accident misinterpreting the situation or misunderstanding the capabilities of the system?

Not everything is a conspiracy.
 
I have sympathy for Sandstruck. I am not arguing that he doesn't have legal responsibility (and neither is he) but based on the information here I think he is blameless in any moral sense that might apply. Nor is there anything he could have reasonably done to prevent the accident, other than not subjecting himself to the risk of using AP at all, which is a risk that most of us take. And I'm surprised at the number of posts that assume he must have made some mistake operating the AP system or the car. Even in systems that are statistically reliable, odd glitches sometimes happen, and I suspect this was one of those. It would be an anomaly if a complicated control system, operated in real-world conditions in a fleet of cars that logs a million miles a day, never failed.

I am one of those who won't use AP with hands on the wheel (for reasons I've detailed elsewhere), but I pay very close attention and keep my hands near the wheel, so I have been able to grab it and take control when autosteer does something it shouldn't, which it has done a number of times. But that's relatively easy to do because a sudden jerk of the car (and the wheel) is a clear signal that something has gone wrong, so my reaction time for reflexively grabbing the wheel has been much less than a second.

On the other hand, I am not so confident that either I or most of the other people here would be able to catch and override an error where the car is decelerating and thus doing what it is supposed to do (as the OP said in his original post) but not doing it quickly enough. I suspect the reaction time needed to detect a failure of degree rather than a failure of kind is much longer, probably too long to correct, particularly given that TACC has such a good record of reliability that it is reasonable for the driver to be focusing on other kinds of potential system failures than that particular one.

My conclusion is that the same result could have happened to me or to almost any of us except that we have so far avoided the OP’s sheer bad luck.
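To put rough numbers on the reaction-time point above, here is a back-of-the-envelope sketch. The 60 mph speed, the one-second reaction, and the braking rate are all assumptions for illustration, not anything taken from the car's logs:

```python
# Back-of-the-envelope stopping distance (all figures are assumptions,
# not anything pulled from the car's logs).
MPH_TO_MS = 0.44704          # metres per second per mph

speed_mph = 60.0             # assumed highway speed
reaction_time_s = 1.0        # assumed gap between warning and braking
decel_ms2 = 7.0              # assumed firm braking on dry pavement (~0.7 g)

v = speed_mph * MPH_TO_MS                    # ~26.8 m/s
reaction_distance = v * reaction_time_s      # ground covered before the brakes bite
braking_distance = v ** 2 / (2 * decel_ms2)  # distance to stop once braking hard

print(f"reaction distance: {reaction_distance:.0f} m")   # ~27 m
print(f"braking distance:  {braking_distance:.0f} m")    # ~51 m
print(f"total:             {reaction_distance + braking_distance:.0f} m")
```

If anything like those numbers apply, a warning that arrives only a second before impact leaves essentially no usable margin at highway speeds, which is exactly why a "failure of degree" is so hard to catch.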

The service manager at Tesla informed me that the engineers who analyzed the logs from my car found no anomaly: the TACC performed as designed. The system triggered the emergency alarm when the collision was imminent, and the driver (me) applied the brakes after approximately one second (so much for my catlike reflexes). At the time of the accident TACC was fully functioning: no faulty camera or sensor.

I asked why the car didn't stop on its own--why was I required to slam on the brakes in the first place? The service manager told me that I bore all responsibility. I told him I accepted all responsibility. But the emergency alarm sounded only one second before the time of the collision, I explained, rendering it essentially useless. He repeated that the accident was the driver's responsibility and that he wasn't in the car to witness what happened.

I told him that I didn't remember going that fast at the time of impact (it was a fender-bender after all), and that it seemed like the car should have stopped. It wasn't at all like an emergency braking situation (until the very last second). He told me the accident was my responsibility. I may have sworn at him.

I asked about my speed at the point the emergency alarm went off and he promised to ask the engineers. "We can't know if you were going downhill, if there was ice on the pavement. We can't know any of this," he said. I apologized for being so rude.

I've had time to reflect on this minor accident and I've read all the posts in this string and my conclusion is the following: the accident was entirely my fault. I don't blame the car one iota. I wasn't careful enough and I had unrealistic expectations for the technology. After talking to Tesla, I drove home with my broken nose cone, set the distance to seven, and engaged the AP (my hands hovering above the steering wheel, my foot covering the brake). It worked beautifully.

While I cringe at the thought of diving into a divisive argument, I've changed my mind from the first post above: I am arguing that he does not have legal responsibility for his accident, Tesla does. I will assume the OP is reporting all the facts correctly. The situation was one where the car ahead was being followed by TACC, which began to slow when that car slowed. This is a situation of the kind where TACC is advertised to be usable. And now the service manager said that 'TACC performed as designed', meaning it was functioning, not impaired by a sensor problem, and not inadvertently disengaged thanks to driver error. If 'TACC performed as designed' and that led to a crash, then almost by definition there is a design error.

Here is where I need input from a tort lawyer. I am not a lawyer but listened to a series of lectures on tort law some time ago, and my recollection is -- and this is my key point -- that when a product is put out for use by the public, and is used correctly, and fails due to a design or manufacturing defect, which results in a loss or injury, it does not matter how many times the company, its product manual, or its representatives say 'use at your own risk' or 'the accident is the driver's responsibility', because if that sufficed to shield companies against liability then companies could just say those things and sell shoddy products without fear. And under the doctrine of 'strict liability' in torts it also doesn't matter that the company was not negligent in its efforts to design the product: the existence of the defect itself is what matters. Thus pleading it was an early-stage product does not get the manufacturer off the hook.

I am not advising the OP to do any particular thing. But I would advise Elon and company to be nice to the OP and quietly make him happy and make him whole for his loss.
 
So, after reading this thread, I will admit that I still don't know the answer to a basic question:

Is there a system on the newer Teslas that is designed to automatically stop the car instead of letting the car hit something?

It sounds like no (given what happened to the OP), but if that's right, what is Automatic Emergency Braking? And how does TACC work? It matches speed until a certain low threshold, and then just lets the car run into something in front if it goes slower than that?
 
Is there a system on the newer Teslas that is designed to automatically stop the car instead of letting the car hit something?

I have the same question. Personally I don't care if it's TACC or AP; I thought there was a basic safety package on the newer-sensor S's that would prevent this. I'm sure I heard that the car surrounds itself with a bubble. You see a ton of commercials from other manufacturers showing cars reversing out of driveways and auto-stopping for hazards that come into their path, so why did the S allow itself to hit another car? It surely sensed something in its way, and if it didn't, then something wasn't working correctly IMHO.
 
I took delivery of my new P90D last weekend (trading in my Model S 85 from 2013) and downloaded Firmware 7.1 two days ago. I had used autopilot for a few days with Firmware 7.0 and found it wonderful (astounding). Today was the first day I tried it using 7.1.

At about 8:30 this morning on I-90 (road conditions perfect, visibility good), I was doing about 60 MPH and switched on Autopilot. I initiated a lane change with my turn signal and the car switched lanes seamlessly. My car automatically modulated my speed (with a two-car distance) to match the car in front of me, and I was cruising along happily when the car in front of me changed lanes and my car caught up to the car that had been in front of him. After following this new car for a few minutes, the traffic began to slow.

My car slowed as well. But when the car in front of me came to a complete stop (not a sudden emergency stop, but rather a gradual stop), I expected my car to do the same (as it had been doing previously). It didn't. I slammed on the brakes in that dreadful instant when I realized my car wouldn't stop in time, but I still hit the car in front of me (while going maybe 5-10 MPH). I'd like to mention that I consider myself a very safe driver and have never been involved in any accident before (I'm 52). I damaged that car's rear bumper and cracked the plastic cover on my new Tesla (see attached photo).

After the police came (the other driver insisted we file an accident report) and I received a $120 ticket, I called Tesla's technical assistance. The gentleman I spoke with told me that the Autopilot function is "flawless" and that it was my responsibility, as the driver, to avoid any collision. I asked to speak with his supervisor, who told me that this was the first time anything like this had ever happened, and found it "very strange." They were clearly intimating that I did something wrong.



How fast were you going before your car started to slow? How fast were you going when you applied the brakes yourself? I'm trying to determine whether your car started to slow with the car ahead of you but just didn't recognize it had stopped and didn't slow fast enough.
 
The biggest frustration I had with Tesla's service manager, besides his repeating over and over that I bore all responsibility for the accident (I never once claimed otherwise, and at one point I asked him not to keep saying it), was his unwillingness to speculate why, in theory, the car wouldn't stop. "Sir, I wasn't there so I can't answer that," he kept responding.

During the discussion I asked him numerous times if the logs would show whether I'd inadvertently switched off the TACC, and he affirmed that they would (at one point I asked him what TACC stood for and he had to ask someone else). I proposed that perhaps the radar (or camera) temporarily lost the car in front of me, or maybe it was still tracking another car that had switched lanes and I didn't realize it until it was too late. He said those scenarios were plausible.

So my conclusion is that I must not have realized I wasn't tracking the car in front of me. The alarm that went off right before impact advised me that the car was in over its head and required me to intervene. The dilemma for me is that cars coming into or leaving my lane have a bigger chance of confusing the system, but to minimize encroaching, I must set the time/distance lag at an unsafely short interval.
 
The difference is that one of us is actually using logical reasoning and the other is not. I'll leave it as an exercise to the reader to determine which is which.

There is a huge difference between the brakes failing and something related to autopilot failing. (First, it's worth noting that we don't even know if this had anything to do with autopilot at all, since the OP admits they may have disengaged it.)

In this case, autopilot did not fail. Have you read the manual and all of the caveats surrounding the functions of autopilot and TACC? I suspect you have not. It specifically states its limitations and that it can not and is not designed to be relied on to control the vehicle in all circumstances.

Now, if you read that same manual and find me a caveat that states limitations on where the brakes will not attempt to stop the car when pressed, let me know. Until then, the failure of these two things are not comparable.

And again, the driver is in complete control of the vehicle at all times. If the driver lets their car run into a semi and kill their family, how is that any different than a driver of a Honda Civic doing the same thing with regular cruise control? The driver is still in complete control. It amazes me to no end that people are still trying to argue that this is not the case. Until my car has its own driver's license, I'm the driver.

- - - Updated - - -

I have to take this a bit further and again point out that if the driver is using autopilot as instructed and the car attempts to steer somewhere it shouldn't, the driver's hands should be right there to not permit it. So the "steering under a semi" nonsense is nonsense.

Just because no one keeps their hands on the wheel doesn't mean this is how it's supposed to be used, and it doesn't absolve anyone of responsibility.

- - - Updated - - -

If I have autosteer/TACC engaged and...

I uncover a wire terminal on the car that says "DANGER HIGH VOLTAGE", and I touch it after clearly being advised of the danger and die... it's Tesla's fault, right, because autopilot!

I get a flat tire and it makes it hard to control the car such that I drive off a cliff.... that's Tesla's fault, right, because autopilot!

I jerk the wheel to the side with my leg while fidgeting, drive off a cliff, and die... that's Tesla's fault, right, because autopilot!??!

I run over a pedestrian and kill them... that's Tesla's fault, right, because AUTOPILOT?!

I see that the car can't steer properly due to poor lane markings, and I let it, and we run off the road, fall down a cliff, smash into a small village and kill 47 people after ironically landing on top of a large oil tank... that's Tesla's fault, right, BECAUSE AUTOPILOT?!

I hit a deer... that's Tesla's fault, right, BECAUSE AUTOPILOT?!

I run over my dog... that's Tesla's fault, right, BECAUSE AUTOPILOT?!

Obviously I can go on, but I hope the sarcasm is dripping from this post and how ridiculous it sounds to blame autopilot for basically anything. It's not for replacing the driver. Just like cruise control, just like ACC, etc etc etc and any other feature. YOU'RE STILL THE DRIVER.


This discussion is now a little overtaken by events, because taking Tesla at their word there was no defect in AP here.

That said, the hypothetical case where a defect in the AP leads to an accident is an interesting one. Let's say for the sake of argument that there is a clear and unequivocal defect in Autosteer, and that this defect causes the car to steer itself into oncoming traffic. The driver fails to notice in time, and doesn't take corrective action.

Is Tesla liable? Under the doctrine of strict product liability, I'm going to guess yes, regardless of the driver's negligence.

Consider a couple of analogous cases: a) Driver gets drunk, proceeds to speed in her Honda Accord. Runs into wall. Takata airbag deploys, but kills driver. Is the driver negligent? Yes. Was the accident the fault of the driver? Yes. Was the proximate cause of the injury a defect in the vehicle? Again, yes. As a result, there is liability for the defect, regardless of the driver's negligence.

b) Driver again gets drunk. Gets into GMC Jimmy. Doesn't buckle belt. Loses control of the car, the car rolls. A defect in the door latch allows the door to come open. The driver is ejected and killed. Is the driver negligent? Yes. Was the accident the fault of the driver? Yes. Was the proximate cause of the injury a defect in the vehicle? Again, yes--but for the defective latch, the driver would not have been ejected. As a result, there is liability for the defect, regardless of the driver's negligence.

Strict liability is a concept that can chafe at times, because it seems intuitively wrong to award damages to someone whose own negligence contributed to the injury. But as a policy matter it makes sense to identify and penalize product defects, in order to give manufacturers the incentive to correct the defects (or take steps not to introduce the defects in the first place).

The caveat to all of this is that it's been a long time since I went to law school, and product liability isn't my thing. If I've gotten any of the details wrong, feel free to correct.
 
If the driver in your second example wasn't belted, would the door latch even matter? He could have been ejected through the windshield anyway.


That's the thing about proximate cause in a case like this. Even if there are a lot of other possible ways someone *could* be injured, the relevant question is how someone *was* injured--and if the defect caused a reasonably foreseeable injury, it's irrelevant that the person in question might have been injured in some other way if things had played out differently.
 
The biggest frustration I had with Tesla's service manager... was his unwillingness to speculate why, in theory, the car wouldn't stop.

I think this is understandable. Anything he speculates could be used against Tesla by someone and could very well be wrong speculation. If he were wrong then that might get the person in the accident even more upset.
 
Well, for starters, 2 doesn't mean a 2-car distance. At 7 it means you will occupy, in 3.5 seconds, the space the vehicle in front of you is in now.
Please describe how you calculated that time and provide a source from Tesla supporting your calculation.
Here is what the 7.1 Model S user manual says about the TACC settings, quote:

"To adjust the distance you want to maintain between Model S and a vehicle traveling ahead of you, rotate the cruise control lever to choose a setting from 1 (the closest following distance) to 7 (the longest following distance). Each setting corresponds to a time-based distance that represents how long it takes for Model S, from its current location, to reach the location of the rear bumper of the vehicle ahead."
So the settings are a "time-based distance" but no formula is provided for calculating the time.
When I have used TACC I select a number that provides a following distance that I judge to be a safe distance based on what I was taught when I learned to drive and based on my experience driving. That means a setting of 6 or 7 at freeway speeds. I only use TACC on divided highways with no cross traffic, as per Tesla's recommendations. I do not use TACC on any other type of roads.
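For a sense of scale, here is a minimal sketch of what a "time-based distance" works out to in feet. The time gaps used below are purely hypothetical examples, since, as quoted above, Tesla doesn't publish what each setting corresponds to:

```python
# Convert a time-based following gap to distance at a given speed.
# The gap values below are hypothetical examples; Tesla does not publish
# what each TACC setting (1-7) actually corresponds to.
MPH_TO_FPS = 1.46667  # feet per second per mph

def following_distance_ft(speed_mph: float, gap_seconds: float) -> float:
    """Distance covered in gap_seconds at speed_mph, in feet."""
    return speed_mph * MPH_TO_FPS * gap_seconds

for gap_s in (1.0, 2.0, 3.5):  # hypothetical time gaps, not Tesla's numbers
    print(f"{gap_s:.1f} s gap at 65 mph ≈ {following_distance_ft(65, gap_s):.0f} ft")
# 1.0 s ≈ 95 ft, 2.0 s ≈ 191 ft, 3.5 s ≈ 334 ft
```

Whatever the exact per-setting times are, the practical point stands: the same setting buys you much more physical distance at freeway speed than around town, which is why judging it against what you were taught about safe following distances makes sense.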
 
Is there a system on the newer Teslas that is designed to automatically stop the car instead of letting the car hit something?
Yes, the driver is supposed to stop the car instead of letting the car hit something, same as every car ever produced.
The car has a system called automatic emergency braking that is designed to reduce the force of an already unavoidable impact, but not to prevent the car from hitting something. It does not engage until it believes a collision to be completely unavoidable, and then it will apply the brakes to reduce speed by a maximum of 25 mph and down to a minimum of 5 mph before disengaging. It will also disengage if the driver takes over and does something to try to avoid the collision themselves. (A rough sketch of this logic follows after this post.)

TACC attempts to match the speed of a car in front, including slowing right to a stop, however it can't see all cars in all situations, and it is the driver's responsibility to take over if needed. To make certain that the driver knows this, it has been repeated over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over

To even know how to engage autosteer you need to look in the release notes, which explain fully that the driver remains in control of the vehicle at all times, and that they must at all times be ready to take over control instantly. You cannot physically engage Autosteer the first time without going into the settings menu and turning it on; when you do, a dialog box pops up and explains that you must remain in full control of the vehicle at all times, and you have to agree to this dialog box or you will never get to use Autosteer. Even after you do that, the car wants to make certain that you really do understand this, so every single time you engage Autosteer a popup appears on the driver's display telling you this.

Driving with Autosteer engaged, and used according to the directions in the manual, the release notes, the confirmation dialog, and the popup that appears every time you engage it, will ALWAYS result in a safer driving experience than not using it. The problem is the people who ignore those instructions and sit idly by while their car drives into something.

This is absolutely no different than setting cruise control in your 1980s car and driving into a brick wall. The ONLY difference is that somehow this case garners sympathy for the person who did it, whereas with the 1980s car we shake our heads and wonder what the driver was thinking.
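To make the AEB description above concrete, here's a minimal sketch of that logic. The thresholds (fires only when a collision is judged unavoidable, sheds at most 25 mph, releases below 5 mph, yields to driver input) are the ones stated in that description; the control structure itself is only a guess at how such a check might be organised, not Tesla's actual implementation:

```python
# Rough sketch of the AEB behaviour described above. Thresholds come from the
# post; the structure is an assumption, not Tesla's actual code.

def aeb_should_brake(speed_mph: float, speed_at_trigger_mph: float,
                     collision_unavoidable: bool, driver_intervening: bool) -> bool:
    """Return True if automatic emergency braking should be applying the brakes."""
    if driver_intervening:
        return False   # driver input overrides AEB
    if not collision_unavoidable:
        return False   # AEB is not a collision-prevention system
    if speed_mph <= 5.0:
        return False   # releases once speed is low
    if speed_at_trigger_mph - speed_mph >= 25.0:
        return False   # has already shed its maximum 25 mph
    return True
```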
 
Yes, the driver is supposed to stop the car instead of letting the car hit something, same as every car ever produced.
+1000
 
That said, the hypothetical case where a defect in the AP leads to an accident is an interesting one. Let's say for the sake of argument that there is a clear and unequivocal defect in Autosteer, and that this defect causes the car to steer itself into oncoming traffic. The driver fails to notice in time, and doesn't take corrective action.

Is Tesla liable? Under the doctrine of strict product liability, I'm going to guess yes, regardless of the driver's negligence.

Going to have to echo green1's post and reference my other post again.

First, the airbag is designed to do what an airbag does, which is deploy in the event of a collision. Did it do this in your example? Sounds like it, so it worked as designed. Now since the airbag is a safety thing, and is supposed to know when to deploy etc etc, if it deploys when it isn't supposed to then, well, someone has some explaining to do.

Next, the door latch is supposed to hold the door closed. Plain and simple, no caveats around that. If on a new vehicle this fails, then I could see there being an issue. If on some old rust bucket it failed, well... I think you're going to have a tough time with that one.

Again, none of these situations are comparable to autopilot, where nowhere does it claim it's supposed to prevent an accident or slow down the vehicle in all cases. It was never designed for this, nor has it been advertised to do this. At least that's one thing Tesla has actually advertised correctly. So while door latches, airbags, brakes, etc. all have a purpose with little to no caveats on their operation and little ambiguity about liability when they fail to perform, I think the matter with TACC/autosteer is very different. I'm pretty sure someone would have a heck of a time taking Tesla to court over something like this.