
WARNING: I rear-ended someone today while using Auto Pilot in my brand new P90D!

Maybe because nobody has released a car to the public that has autonomous driving capability yet?

- - - Updated - - -


Do you believe he would have crashed had he been in a pre-autopilot car? If yes, then I suppose we could say that he couldn't have avoided it. But the AP makes no difference in this. The AP wasn't driving; he was. AP doesn't drive, it merely assists.


If someone does not leave themselves enough time to react, they are a dangerous driver. This is simple. He was driving, he failed to brake for the vehicle ahead, and he hit it. AP is 100% irrelevant here; he's scapegoating. "I couldn't have avoided it" is rubbish: he clearly states there was plenty of time to avoid it, but then blames the AP for not doing so. It's not the AP's job to do so, it's his.

If what you say is true, then one should always be braking at the same time as AP would, making AP useless.
 
Two comments.

First, I wonder if the OP's settings were possibly affected (changed back to default, perhaps) by the installation of 7.1 software, without his realizing it. I think that may have happened to my car with version 7.0. If so, that might have affected the operation of either TACC or of the automatic emergency brake function. (I did not see this mentioned in the 150+ posts above, although I may have missed it.)

Second, I recall a similar-sounding circumstance in which I was using TACC (and possibly auto-steer) and something changed ahead and I had to intervene because the car was not slowing down as much or as quickly as needed. I had to hit the brakes damned hard to stop in time to miss the car ahead, and was shocked to realize that TACC had not done so. Unfortunately, I do not recall enough of the details to make an explicit comparison to the OP's situation. But it does reinforce one or two other posts above that indicate that there are indeed situations in which TACC may not act as quickly as you'd want it to.
 
I think that's actually a bit too easy to do, and if you do it (brush the brake pedal, say) there is no indication except that the TACC indicator switches color. No audible alert, nothing.

I realize that's the standard for traditional cruise controls, but with TACC and other adaptive cruise systems it seems like it should give you a little bit of warning. Does the Mercedes system do anything when you disengage distronic?

Mercedes does not. However, aircraft autopilots all give a loud auditory warning, and display a flashing warning light for about 5 seconds, when the autopilot is disengaged.
 
Could you have switched off autopilot without realising? The logs should be able to confirm this.

Could you have at least skimmed this thread before posting?

OK, perhaps that is a little harsh, but it's a pet peeve of mine when a poster jumps into a long thread with some obvious comment that almost certainly has been made before and assumes it hasn't been made, and clearly hasn't taken the time to read the thread.

vchadha--without going back to check, I'm going to estimate that the possibility you just raised has been mentioned in at least thirty or forty posts in this thread, and, even more importantly, the OP has actually said in more than one post that he acknowledges that it is conceivable that that is what happened.
 

Wait wait... a thread about someone not reading the instructions getting posts by people not even reading the thread? How ironic... ;)
 
For other new owners, I suggest using autopilot with hands on the wheel and foot covering the brake, paying very close attention, for the first several hundred miles of use. Watch what it does, where it can get confused, etc. Then as you start noticing that you can anticipate how it will behave, you can relax--just a little bit.

Sage advice here. Start from a position of distrust of the automation, work out where it fails, learn how it responds normally, and then you start to learn to take over when it is not responding normally.
 
What? You realize it still would actually be 0% on Tesla, right? Not even 1% on Tesla. 0%. Tesla was not driving the car. If the brakes failed or something when the driver went to stop his new car, maybe. But the driver just let his car hurtle into another car. That's not Tesla's fault no matter what.

Edit: Wow, I seem to be defending Tesla on some topics lately... what's gotten into me? Oh wait... nothing has changed. Logic still prevails. :p

It always amazes me how two people can look at the exact same set of facts and come to exact opposite conclusions, as we seem to be doing here.

How you can say if the brakes fail and cause an accident it's Tesla's fault, but if AP fails and causes an accident it's not, is completely beyond me!

You know, I wonder if Tesla thinks putting "Beta" in front of AP relieves them of responsibility for it being a safe feature. I'm thinking maybe that's why some on the forum seem to take this attitude "if you're killed using AP Tesla isn't responsible". But I promise all of you, in court if the data shows Tesla's camera or radar malfunctioned causing AP to steer the car under an 18-wheeler killing a family of four, Tesla is in one hell of a bad situation - and they will pay out millions in lawsuits to the family of the victims. AND THEY SHOULD. Fear of financial penalties is the only thing motivating manufacturers to make sure their products are safe.

Here, if the logs show Tesla's equipment malfunctioned causing this minor fender bender, Tesla should (if they want to continue to be the exceptional car company, better than the traditional car companies) pay for all damages to both cars. If they admit the logs show that AP failed and don't offer to cover all damages I'll be surprised, because any lawyer could walk into court and force them to. Again, if the brakes failed would anybody think lawyers would not be involved? If you were put in the hospital because your car's brakes failed, wouldn't you get a lawyer? Who wouldn't? AP is no different than brakes.
 
I can't make anything of this. It seems impulsive and misguided to report an accident like this prior to getting feedback from Tesla. The events surrounding the accident will be available for analysis.

Everybody makes mistakes, memory and perception can be faulty. It is much more likely that you had autopilot / cruise control turned off than some kind of weird system failure. Given that autopilot is being judged with restrictions in flux, I would have made darn sure I had all of the facts before I placed some "warning" on a message board.

There also may or may not be secondary gain involved... try to get Tesla to pay for it... get insurance to pay for it... keep from getting spanked for dinging daddy's car... get attention... I don't know and neither does anyone else responding. The only thing I do know for sure is the initial message was impulsive and misguided and that eliminates any credibility.

If the message were placed after all of the facts were in, it could have positive value. By impulsively placing it without waiting for objective data, you have done Tesla and others a disservice.
 

The difference is one of us is actually using logical reasoning, the other is not. I'll leave it as an exercise to the reader to determine which is which.

There is a huge difference between the brakes failing and something related to autopilot failing. (First, it's worth noting that we don't even know if this had anything to do with autopilot at all, since the OP admits they may have disengaged it.)

In this case, autopilot did not fail. Have you read the manual and all of the caveats surrounding the functions of autopilot and TACC? I suspect you have not. It specifically states its limitations and that it cannot be, and is not designed to be, relied on to control the vehicle in all circumstances.

Now, if you read that same manual and find me a caveat that states limitations on where the brakes will not attempt to stop the car when pressed, let me know. Until then, the failures of these two things are not comparable.

And again, the driver is in complete control of the vehicle at all times. If the driver lets their car run into a semi and kill their family, how is that any different than a driver of a Honda Civic doing the same thing with regular cruise control? The driver is still in complete control. It amazes me to no end that people are still trying to argue that this is not the case. Until my car has its own driver's license, I'm the driver.

- - - Updated - - -

I have to take this a bit further and again point out that if the driver is using autopilot as instructed and the car attempts to steer somewhere it shouldn't the driver's hands should be right there and not permit it. So the "steering under a semi" nonsense is nonsense.

Just because no one keeps their hands on the wheel doesn't mean this is how it's supposed to be used, and it doesn't absolve the driver of responsibility.

- - - Updated - - -

If I have autosteer/TACC engaged and...

I uncover a wire terminal on the car marked "DANGER HIGH VOLTAGE", and I touch it after clearly being advised of the danger and die... it's Tesla's fault, right, because autopilot!

I get a flat tire and it makes it hard to control the car such that I drive off a cliff.... that's Tesla's fault, right, because autopilot!

I jerk the wheel to the side with my leg while fidgeting, drive off a cliff, and die... that's Tesla's fault, right, because autopilot!??!

I run over a pedestrian and kill them... that's Tesla's fault, right, because AUTOPILOT?!

I see that the car can't steer properly due to poor lane markings, and I let it, and we run off the road, fall down a cliff, smash into a small village and kill 47 people after ironically landing on top of a large oil tank... that's Tesla's fault, right, BECAUSE AUTOPILOT?!

I hit a deer... that's Tesla's fault, right, BECAUSE AUTOPILOT?!

I run over my dog... that's Tesla's fault, right, BECAUSE AUTOPILOT?!

Obviously I can go on, but I hope the sarcasm is dripping from this post and how ridiculous it sounds to blame autopilot for basically anything. It's not for replacing the driver. Just like cruise control, just like ACC, etc etc etc and any other feature. YOU'RE STILL THE DRIVER.
 
Given the number of automakers (and others!) working on autonomous vehicles, and the profusion of "driver assistance" technology that allows cars to take over more and more of the tasks associated with driving, it is insane to me that there is no federal safety standard on how these technologies are to be implemented.

Bill just introduced in the Maryland General Assembly: SB126 TASK FORCE TO STUDY ISSUES RELATED TO THE USE OF SELF–DRIVING VEHICLE Establishing the Task Force to Study Issues Related to the Use of Self-Driving Vehicles; requiring the Task Force to determine the most effective and appropriate best practices for governing self-driving vehicles based on a review of specified information; providing for the composition, chair, and staffing of the Task Force; requiring the Task Force to make specified recommendations; requiring the Task Force to report its findings and recommendations to the Governor and the General Assembly on or before January 1, 2018; etc.
 
Just for clarification TACC numerical settings are not simple distances nor car lengths. The numbers are time in seconds until the Tesla, at present speeds, will occupy the space now occupied by the vehicle in front of the Tesla.

To clarify your clarification: the number is a time setting in seconds, but "1" does not mean 1 second following distance at the present speed, and "7" does not mean 7 seconds of following distance at the present speed. (Not sure if that's what you meant--so just clarifying for others if they weren't aware).

If 7 were 7 seconds, then at 70 mph I'd be over 700 feet behind the car in front of me, which is definitely not the case!
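For anyone who wants to check that back-of-the-envelope figure, it's just the standard time-gap arithmetic (speed times gap), nothing specific to Tesla's implementation:

```python
MPH_TO_FPS = 5280 / 3600  # feet per mile divided by seconds per hour

def following_distance_ft(speed_mph: float, gap_seconds: float) -> float:
    """Distance traveled in gap_seconds at speed_mph, in feet."""
    return speed_mph * MPH_TO_FPS * gap_seconds

# At 70 mph a true 7-second gap would be roughly 719 feet,
# which matches the "over 700 feet" estimate above.
print(round(following_distance_ft(70, 7)))  # -> 719
```

(The commonly quoted "a car covers 88 ft/s at 60 mph" rule of thumb falls out of the same conversion.)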
 

I had been puzzled by this - I've been counting seconds until my car passes the same spot the car in front of me just passed and it never was 7 seconds - always less. Does anybody know what "7" is 7 of? It definitely isn't seconds, and it's not car lengths.

- - - Updated - - -


I'm sorry if I upset you.

I don't agree with you at all, but there's no point in discussing it further. I hope to God none of the nightmare scenarios takes place, and Tesla keeps improving our current cars so that they are safer and safer every release. I was definitely impressed with the improvements in taking curves and not diving for exits that were clearly made in 7.1.
 

It's unitless. All it means is that 2 is larger than 1, so the 2 setting is a larger following distance than 1. It has NO other meaning. The difference between the highest and lowest numbers needs to be scaled up, and it doesn't fall nicely on whole numbers. If it were one to one, the settings would be something like 1.56, 2.347, 2.592, 2.811, 3.231, 3.657, 4.001231441--which would just be weird.
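If the setting really is just a unitless index onto some internal range of time gaps, the mapping might be a simple linear interpolation like the sketch below. The `min_gap` and `max_gap` endpoints here are made-up illustrative values, not Tesla's actual calibration:

```python
def setting_to_gap(setting: int, min_gap: float = 1.0, max_gap: float = 3.5) -> float:
    """Map the unitless 1-7 following-distance setting onto a hypothetical
    range of time gaps (seconds) by linear interpolation.
    The endpoint values are assumptions for illustration only."""
    if not 1 <= setting <= 7:
        raise ValueError("setting must be between 1 and 7")
    return min_gap + (setting - 1) * (max_gap - min_gap) / 6

print(setting_to_gap(1))  # -> 1.0
print(setting_to_gap(7))  # -> 3.5
```

That would be consistent with the observation above: "7" gives a noticeably longer gap than "1" without ever reaching a literal 7 seconds.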

- - - Updated - - -


Electricfan, it is fair to point out that you were perfectly comfortable reading a book while your car was on autopilot on the highway, which is relevant to the argument.
 
Did Tesla ever get back to the OP after reviewing his logs? There seems to be a lot of side discussions to pick through here.

I spoke with my local service manager about ten minutes ago, who told me "the engineers" were still analyzing the logs. Sorry if some people think my warning was irresponsible or rash, or an attempt to extort money from Tesla or to "hide a ding" from my geriatric father (ah, if only I were so young), or done for the attention. Maybe I was too impulsive and should have waited for the results. The last thing I want to do is hurt Tesla (or the resale value of my extravagant, Ludicrous purchase). Promise I'll post the results once I get them. Please be nice to each other!
 
I, and I think many others, appreciate that you shared! I think it is important no matter what the logs show. You have been on the forum long enough to know that this is how the discussions flow! It would be nice if we could play a bit friendlier!
 

Yes, and I still do. When 7.1 came out I didn't for the first few trips to work, but now I feel pretty good about it. I think it's better than before. I come up on stopped traffic regularly in Houston rush hour and the car has never failed to slow and stop. But I do keep the distance on 7. (Every other setting goes up to 11, why not this one? I'd use 11 if it were available.)

But my feeling is, Tesla advertised and sold this car saying it would drive itself. The website doesn't use the word "autonomous", but all the definitions are really meaningless. They advertise the car will drive SAFELY on a divided highway and stay in its lane in full control of the brakes, accelerator and steering wheel. I'm trusting them to do that. I pay attention, but I read a few minutes, look up, read a few minutes and look up again. Anytime the car slows or makes any movement that might indicate a problem I pay attention. I try to keep a hand on the wheel, but it's uncomfortable so I don't much of the time.

That's why I think this thread is so important. I wouldn't have been using 2 on the highway, but even so, if the car just completely fails to even attempt to slow when cars in front of it do, that's a bad thing. That can't happen. No matter what you think Tesla promised, sold, or requires - the car must slow if traffic ahead slows. 100% of the time. If it doesn't we need to know it.

AP is useless if you have to hold the wheel and look straight ahead every second. I don't think any reasonable person would buy the car if Tesla advertised it like that. I know I didn't assume that when I bought it.
 

No they didn't. They actually say just the opposite. What does the manual have to say? Don't you read the warning that comes up every time you activate autopilot?

Just trying to warn you before someone or something gets hurt. It would be a shame if an innocent person got killed because of blissful ignorance. If it does happen, don't say you weren't warned.