
WARNING: I rear-ended someone today while using Auto Pilot in my brand new P90D!

Yes, and I still do. When 7.1 came out I didn't use it the first few trips to work, but now I feel pretty good about it. I think it's better than before. I come up on stopped traffic regularly in Houston rush hour and the car has never failed to slow and stop. But I do keep the distance on 7.

But my feeling is, Tesla advertised and sold this car saying it would drive itself. The website doesn't use the word "autonomous", but all the definitions are really meaningless. They advertise the car will drive SAFELY on a divided highway and stay in its lane in full control of the brakes, accelerator and steering wheel. I'm trusting them to do that. I pay attention, but I read a few minutes, look up, read a few minutes and look up again. Anytime the car slows or makes any movement that might indicate a problem, I pay attention. I try to keep a hand on the wheel, but it's uncomfortable, so I don't much of the time.

That's why I think this thread is so important. I wouldn't have been using 2 on the highway, but even so, if the car just completely fails to even attempt to slow when cars in front of it do, that's a bad thing. That can't happen. No matter what you think Tesla promised, sold, or requires - the car must slow if traffic ahead slows. 100% of the time. If it doesn't we need to know it.

AP is useless if you have to hold the wheel and look straight ahead every second. I don't think any reasonable person would buy the car if Tesla advertised it like that. I know I didn't assume that when I bought it.

Wow... um.... wow. So yeah, as suspected, did not RT*M.

I'll keep an eye out for your car on the salvage market when you inevitably total it. Good luck blaming it on Tesla lol. *archives this post* (grabbed it with the Way Back Machine just to make sure the evidence is here for a future accident investigation: WARNING: I rear-ended someone today while using Auto Pilot in my brand new P90D! - Page 18 )

- - - Updated - - -

Conversation at scene of accident:

Officer: So, what happened?
Electricfan: Well, I was reading a book, and...
Officer: You have the right to remain silent...
 
How is it that, despite everything Elon has said, everything Tesla has said, all the warnings in the manual and the release notes, the warning you have to acknowledge before enabling Autopilot in the menus, and the pop-up message that appears every single time you engage it, some people still think that the car drives itself????

This is just a mind-boggling phenomenon, I simply don't understand how anyone could come even remotely close to that idea.

Everyone who has posted anything referring to the car as self-driving, or referring to autonomous vehicles, past, present, or future, stop it!!! There is no autonomy involved in this situation.

As I've said before, there is only one possible way for Autopilot to EVER be responsible for a collision in its current form, and I've never once heard anyone even hint at it having happened in any collision or close call: if Autopilot somehow prevented the driver from taking over. As that didn't happen, Autopilot bears zero responsibility.

For the person suggesting that if you have to pay attention and be ready to take over there's no point: fine, don't use AP then. To the one saying you'd have to always use the brakes at the same time as AP: not at all. You just have to set your following distance to something safe that allows you time to take over if AP doesn't get to it first.
 
I spoke with my local service manager about ten minutes ago, who told me "the engineers" were still analyzing the logs. Sorry if some people think my warning was irresponsible or rash, or an attempt to extort money from Tesla, or to "hide a ding" from my geriatric father (ah, if only I were so young), or done for the attention. Maybe I was too impulsive and should have waited for the results. The last thing I want to do is hurt Tesla (or the resale value of my extravagant, Ludicrous purchase). Promise I'll post the results once I get them. Please be nice to each other!

Thanks for sharing; it benefits everyone when people share their experiences with Autopilot. Sharing promotes and speeds up learning.


As a few people pointed out, even if Autopilot failed, the driver is fully responsible for the car's operation. Only a car failure, such as a failure to steer or brake, could be attributed to Tesla, and that did not happen in your accident.


For that reason, some people, including myself, might object to the phrasing of this thread title. The title is attention-grabbing, implies an Autopilot failure caused the accident, and could easily be exploited to hurt Tesla.


If a thread title aspires to grab attention, like yours does, then it must be semantically accurate. If the title is attention-grabbing and inaccurate, then it is quite unfair and potentially damages someone's reputation (undeservedly). That is how I see it.


A more accurate thread title would be something along the lines of:


WARNING: What not to do whilst on autopilot: I just rear-ended someone's car because my attention dropped whilst driving on autopilot and I failed to stomp on the brake in time


Or similar.


Some people might argue that I am merely arguing semantics here, but in this case the devil is in the detail (the thread title).


If anyone posted an attention-grabbing thread title that inaccurately implied my responsibility for an accident they caused, I would consider that poster hostile towards me.
 
Thanks to the OP for the initial post and the follow-ups on this thread. While it's not always comfortable dealing with reality and describing negative events objectively, it's always for the greater good. Basing replies on subjective or biased information doesn't really promote knowledge and understanding.

Several thoughts and comments popped out to me while reading all of the posts, but I waited until finishing the latest post before responding.

The worst drivers I have experienced for following too closely were those on the freeways in Michigan. My cousin, who is a resident there, said I needed to follow closer or ALL of the other drivers would shoehorn in front of me. I would be traveling between 65 and 70 mph, with TACC on 4, in the slow lane, and it was scary. I generally run TACC at 4 or 5, although slow traffic might be set as low as 2, and fast, aggressive traffic merits a 6 or 7. I stay out of the way, if possible.
My measurements with a stopwatch suggest the TACC settings equate to half-second increments of following time between where I am and the point where the vehicle being tracked is.
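For perspective, here is a rough back-of-the-envelope sketch (Python) of what that works out to in feet at highway speed. The 0.5 s per setting figure is just my stopwatch guess, not an official Tesla number, so treat the output as illustrative only:

MPH_TO_FT_PER_SEC = 5280 / 3600  # 1 mph is about 1.47 ft/s

def following_distance_ft(speed_mph, tacc_setting):
    """Approximate gap to the lead car, assuming ~0.5 s of headway
    per TACC setting increment (an unofficial, measured guess)."""
    time_gap_s = 0.5 * tacc_setting
    return speed_mph * MPH_TO_FT_PER_SEC * time_gap_s

# Example: 65 mph with the setting on 4 -> roughly 190 ft of gap.
print(round(following_distance_ft(65, 4)))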

My MS is in the Fremont SC right now (2nd drivetrain replacement in ~15 months and 55,000 miles), and the front passenger proximity sensor is being reseated or replaced. I drive quite a few miles, and Autosteer and TACC are helpful, but the driver still needs to be the 'command driver'. 7.1 will be installed before I get my MS back, and I look forward to seeing what it entails.

We hear about situations on the news or in internet postings, but they are not always in context, are biased, or smack of sensationalism. When a few MS have caught fire, it has made national or international news. And it's apparent that the news media report it as a major problem with electric cars. The recent news concerned the MS that flambe'd while charging in Norway. Even the firefighters included their comments about the lithium content of the battery pack. However, what was not talked about was any hazmat fire, or the dangers of flammable plastics, etc. There are numerous vehicle fires in the US every day; they are not as newsworthy unless the vehicle is a Ferrari, Lamborghini, or AMG. A fire involving a common ICE vehicle is generally reported in traffic reports as just a vehicle fire, much like a car broken down, impacting traffic.

My MS, with 7.0, autosteered much better in Arizona than in California. I attribute it to the very good lane markings. I also noted that the swerving to take an exit in California was due to the total lack of markings across the exit lane; Arizona has many smaller segmented lane markings that Autosteer would recognize, so it would not try to take the exit. I didn't initially like the new dash display with 7.0, but it's very nice to be able to see whether the lane markings are being recognized and displayed, so as to tell if a problem could be imminent.

I believe that Tesla should increase the visual and audible indicators that denote Autosteer and TACC turning off. Like others, including the OP, I would be interested to know whether it was disabled prior to the collision.

I am not happy that a few screwy drivers decided to post stupid videos of Autosteer / TACC on YouTube. Clearly, if Darwinism doesn't weed them out, law enforcement will. Unfortunately, EM stepped in, instead.

Technology is improving and appears to be decreasing accidents. Even with safety systems such as antilock brakes, supplemental restraint systems, better tires, etc., there are still accidents every day, all over the world. If I am in an accident, no matter where the fault lies, I am happy that I'm in a very safe Tesla. That doesn't preclude the accident, nor does it eliminate possible injury or death, but it decreases the odds of injury and death. When I received driver training (more than 40 years ago), we were instructed that there were ways to mitigate injury, but not necessarily to eliminate it. They varied from the worst-case scenario (a high-speed head-on between 2 vehicles) through scenarios with better odds of survival: 2 cars head-on, 1 car into a stationary object or vehicle (yes, whiplash is a consideration), a glancing blow, and low-speed bumper-to-bumper. But, in my head, I want to survive, and not negatively affect any others involved. If my car is totaled, that's of little concern to me. Vehicles can be replaced. Life and limb, not so easily.

I am awaiting 7.1, as well as other future upgrades. I remain aware of the limitations of technology. I cringe at the reference to Autopilot. I retired from the FAA, and in aircraft autopilot implementations, as in pretty much any critical system, there is redundancy. When a system (or engine) fails, redundancy usually prevents a negative outcome. Multiple systems, integrated and designed for the scope of flying, work very well. There are still accidents, but redundant monitoring and control leads to pretty reliable air travel. Critical FAA systems on the ground, like Instrument Landing Systems, radio equipment, and power sources, are all similarly implemented. Now, a Tesla is not really equipped for Autopilot operation. Assistance, yes, but the driver is still in control and legally responsible.
An interesting note was the death of golfer Payne Stewart: the jet he was flying in suffered depressurization, leading to oxygen deprivation, and yet it continued at the assigned altitude and heading until running out of fuel. TCAS (Traffic Collision Avoidance System) might have reacted if a situation had occurred, but in this case there was no conscious pilot to handle it. The point is, a pilot, even with redundant flight systems, is still the pilot in command, and is ultimately responsible. I think it is important to plan for the best but be prepared for the worst. Remember, your MS does not know what you know, and might not arrive at the same conclusions you would.

Drive safe!

Scotty


Any comments about FAA systems are based on published reports.

- - - Updated - - -

There are a few systems, such as Autosteer / TACC, Collision Alert / Blind Spot Indication, AEB, etc., but it's Not AutoPilot!

Scotty
 
Christ, so much B$ rhetoric throughout this entire thread. All the OP intended was to share his interpretation of the situation he experienced, nothing more. He admitted he was at fault and provided what I feel is factual information, sprinkled with his own assumptions about an unknown result, until Tesla can provide more information or evidence about what happened. Instead, a bunch of useless attacks and a complete waste of energy around blame, irresponsibility, and disrespectful comments are being directed at a very unselfish individual who came here for some advice and to find out if others had experienced a similar result. I believe he was simply trying to spread awareness and gain input from others who may have experienced a similar outcome.

To be completely candid, is all of this rhetoric useful, constructive, and supportive to someone who has just had an incredibly terrible week? Where's the human compassion around this unfortunate mishap? Aren't we all here to garner advice, share experiences, seek knowledge, and build camaraderie with other individuals who have similar vested interests? I personally feel very humbled and fortunate that I have the resources to acquire an expensive luxury vehicle, but I'm disappointed to see all of the harsh criticism towards a fellow Tesla owner.

Peace! :)
 
It's unitless. All it means is that 2 is larger than 1, so the 2 setting is a larger following distance than 1. It has NO other meaning.
That is my understanding as well. It's like the car's audio volume control which goes from 1 to 11. Those values have no units, they are just intervals over a range.
So the TACC setting numbers have no inherent meaning. Select a number based on your judgement as to how much distance you need to stop your car safely if the car ahead of you slams on its brakes. My recommendation is that at freeway speeds a setting of 6 or 7 is reasonably safe. Anything less is too risky for me. In stop and go freeway traffic a lower setting is reasonable.
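To get a feel for why the higher settings matter at freeway speed, here is a small illustrative calculation. It leans on the unofficial half-second-per-setting estimate posted earlier in the thread, plus a textbook ~1.5 s human reaction time, so the numbers are ballpark only:

SPEED_MPH = 70
FT_PER_SEC = SPEED_MPH * 5280 / 3600     # about 103 ft/s at 70 mph
REACTION_TIME_S = 1.5                    # commonly cited driver reaction time

reaction_ft = FT_PER_SEC * REACTION_TIME_S
for setting in (1, 2, 4, 7):
    gap_ft = FT_PER_SEC * 0.5 * setting  # assumes ~0.5 s of headway per setting
    print(f"setting {setting}: gap ~{gap_ft:.0f} ft vs ~{reaction_ft:.0f} ft "
          f"covered during reaction alone")

# At settings 1-2 the gap is shorter than the distance covered during a
# 1.5 s reaction, which is why the low settings leave so little margin
# to take over if the car doesn't slow on its own.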
 
There are a few systems, such as Autosteer / TACC, Collision Alert / Blind Spot Indication, AEB, etc., but it's Not AutoPilot!
I disagree; Autopilot is a very apt analogy. Autopilot systems don't generally do their own routing or their own traffic avoidance; that's left to the pilot. Autopilot systems are designed to be monitored at all times by the pilot, who remains in full control of the aircraft. That's remarkably like Tesla's Autopilot. Nobody ever wants to see the pilot of their airliner in the back with the passengers while the flight deck is empty.

Generally I feel that anyone who thinks Autopilot implies more autonomy than Tesla provides probably doesn't realize how little autonomy most aviation autopilot systems really provide.
 
Yes, and I still do. When 7.1 came out I didn't use it the first few trips to work, but now I feel pretty good about it. I think it's better than before. I come up on stopped traffic regularly in Houston rush hour and the car has never failed to slow and stop. But I do keep the distance on 7. (every other setting goes up to 11, why not this one? I'd use 11 if it were available)

But my feeling is, Tesla advertised and sold this car saying it would drive itself. The website doesn't use the word "autonomous", but all the definitions are really meaningless. They advertise the car will drive SAFELY on a divided highway and stay in its lane in full control of the brakes, accelerator and steering wheel. I'm trusting them to do that. I pay attention, but I read a few minutes, look up, read a few minutes and look up again. Anytime the car slows or makes any movement that might indicate a problem, I pay attention. I try to keep a hand on the wheel, but it's uncomfortable, so I don't much of the time.

That's why I think this thread is so important. I wouldn't have been using 2 on the highway, but even so, if the car just completely fails to even attempt to slow when cars in front of it do, that's a bad thing. That can't happen. No matter what you think Tesla promised, sold, or requires - the car must slow if traffic ahead slows. 100% of the time. If it doesn't we need to know it.

AP is useless if you have to hold the wheel and look straight ahead every second. I don't think any reasonable person would buy the car if Tesla advertised it like that. I know I didn't assume that when I bought it.

Man, I avoided wading into this when you posted about your biblioventures last time around, but into the fray!

The level of cognitive dissonance you're willfully demonstrating is, and I'm not being dramatic here: ****ing terrifying. Your "feeling" is that Tesla advertised that the car would drive itself, and that definitions are meaningless. That's cognitive dissonance strike #1: this isn't the King James Version of the Bible. It's not a matter of interpretation where you decide to read it differently than other people. You state "They advertise the car will drive SAFELY on a divided highway and stay in its lane in full control of the brakes, accelerator and steering wheel. I'm trusting them to do that."

They literally advertise the car as being entirely your responsibility at all times. In fact, they state it explicitly:

Never depend on these components to keep you safe. It is the driver's responsibility to stay alert, drive safely, and be in control of the vehicle at all times.

Warning: Traffic-Aware Cruise Control can cancel unexpectedly at any time for unforeseen reasons. Always watch the road in front of you and stay prepared to take appropriate action. It is the driver's responsibility to be in control of Model S at all times.

You say right in your post "the car must slow if traffic ahead slows. 100% of the time. If it doesn't we need to know it."

LISTEN TO ME VERY CAREFULLY: YOU KNOW IT NOW. IT DOES NOT SLOW 100% OF THE TIME. PLEASE STOP ENDANGERING OTHER PEOPLE BY WILLFULLY NEGLECTING THE VERY CLEAR WARNINGS IN THE MANUAL AND CAR UI THAT EXPLICITLY TELL YOU THESE FACTS YOU REFUSE TO ACKNOWLEDGE.

It feels more like you have buyer's remorse because you were very excited at being sold on a future that isn't (in your estimation) quite accurately portrayed, but instead of being upset at Tesla for misrepresenting the capabilities of their vehicle, you're just determined to drive like an idiot until it proves you wrong in the stupidest way possible. I just pray that someone in another vehicle isn't hurt by your neglect to follow every instruction provided to you, because you once swallowed a brochure.
 
Stuff happens, don't beat yourself up. That's why they call it an *accident*.
A car crash is NOT an accident. Close to 99% are due to stupidity on the part of one or more people operating 4000 lb deadly weapons in an irresponsible way.

Each year in the U.S. nearly 5,000 people who are on foot are killed by people driving cars. 92% of these are completely the fault of the driver. Over 700 people riding bicycles are killed by people driving cars each year in the U.S. (about 55% are the fault of the driver, 20% the fault of the bicycle rider, and 25% no fault determined).

Today in the U.S. about 12 people will be killed by people driving cars. Should these drivers not beat themselves up over killing someone? Are we so callous that killing someone or their mother or their child is nothing to be concerned about? These are not accidents.
 
A car crash is NOT an accident. Close to 99% are due to stupidity on the part of one or more people operating 4000 lb deadly weapons in an irresponsible way.

Each year in the U.S. nearly 5,000 people who are on foot are killed by people driving cars. 92% of these are completely the fault of the driver. Over 700 people riding bicycles are killed by people driving cars each year in the U.S. (about 55% are the fault of the driver, 20% the fault of the bicycle rider, and 25% no fault determined).

Today in the U.S. about 12 people will be killed by people driving cars. Should these drivers not beat themselves up over killing someone? Are we so callous that killing someone or their mother or their child is nothing to be concerned about? These are not accidents.

Chill out; you know he didn't KILL anyone, so there's no reason to repeat that a dozen times. It was an unintended, unexpected 3 mph fender bender. No one got hurt. That is the definition of an accident. So, to your question: should he not beat himself up over killing someone? Since we are only talking about this case, the answer is an emphatic NO, because he didn't kill or even injure anyone.
 
Keep in mind that there is a vast difference in brake failure and TACC failure. Brakes are fairly simple from an operational/expectation standpoint — you press the pedal and they then increase drag on the rotation of the wheels and assuming good tires and dry pavement will slow the car. There is effectively one use case for brakes — pressing the brake pedal.

TACC is very different, massively more complicated, and must cover thousands or perhaps tens of thousands of use cases. How can Tesla or anyone know when they have covered EVERY potential instance? At what point can Tesla say that they've covered enough scenarios that drivers can 100% (or 99.999999%?) rely on TACC to slow/stop the car effectively in every instance, including the OP's, when it should?
 
A car crash is NOT an accident. Close to 99% are due to stupidity on the part of one or more people operating 4000 lb deadly weapons in an irresponsible way.

Each year in the U.S. nearly 5,000 people who are on foot are killed by people driving cars. 92% of these are completely the fault of the driver. Over 700 people riding bicycles are killed by people driving cars each year in the U.S. (about 55% are the fault of the driver, 20% the fault of the bicycle rider, and 25% no fault determined).

Today in the U.S. about 12 people will be killed by people driving cars. Should these drivers not beat themselves up over killing someone? Are we so callous that killing someone or their mother or their child is nothing to be concerned about? These are not accidents.

I feel like I've just been run over by a car-load of statistics....

But seriously, I think we do need to take this as a reminder to be ever-vigilant when it comes to AP. And exponentially so, the more cars that are on the road with us at the same time. If it's just you, the car, and a fresh set of dashed lines doing 70 mph all alone, I would take the chance that AP is 99.999999% accurate; more so even than you would be.
 
AP is not very dependable; it has happened to me several times already in situations where an accident could have occurred.
One time, with AP enabled, the car in front of me slowed down at a traffic light while my car was still accelerating; it started beeping but wasn't braking hard, and I had to step in.
Yesterday, with the 7.1 update and AP enabled, I signaled a lane change to the left. There was a car just ahead on my left, its rear roughly even with my front, with some overlap. However, my car still tried to make the change and almost hit his car. I quickly stepped in.

When I was at CES last week, I saw the Nvidia autopilot demo; they mentioned it uses six cameras, including two wide-angle and one narrow-angle camera at the front alone.
 
There are a bunch of self-righteous drivers here.
Everyone else's tips ARE great, but when "some" belittle the guy like he's a child it gets a bit much.
TAKE THE OP's POST AS IT'S MEANT.
The TACC in his case didn't work and it didn't emergency-stop!! HEADS UP, it happened to him and it could happen to you.
Geez
 
A car crash is NOT an accident. Close to 99% are due to stupidity on the part of one or more people operating 4000 lb deadly weapons in an irresponsible way.

Each year in the U.S. nearly 5,000 people who are on foot are killed by people driving cars. 92% of these are completely the fault of the driver. Over 700 people riding bicycles are killed by people driving cars each year in the U.S. (about 55% are the fault of the driver, 20% the fault of the bicycle rider, and 25% no fault determined).

Today in the U.S. about 12 people will be killed by people driving cars. Should these drivers not beat themselves up over killing someone? Are we so callous that killing someone or their mother or their child is nothing to be concerned about? These are not accidents.

Still waiting for the information from Tesla concerning the "accident" (and I do consider it an "accident" because I certainly didn't hit the poor schmuck ahead of me on purpose). But please permit me to add a little color commentary to the unfortunate event:

A typical commute to Chicago on I-90, visibility good, road conditions optimal. As I described previously (and again, memory is fickle), AP seemed to be functioning perfectly. For whatever reason--a dopamine response perhaps--the sensation of the car driving itself was bringing me significant pleasure.

I remember that the car I was following switched lanes, and apparently my car picked up the scent of the unfortunate victim. His car gradually slowed, and I gradually slowed, until we were traveling at around 10 MPH. I remember his car stopped, my car slowed but didn't stop, I slammed on the brakes and hit him. He pulled over on the left shoulder of the highway. I tried to motion him to the right shoulder but to no avail.

I got out of my car, looking incredibly handsome and debonair (well, maybe not so much). He got out of his car.

"I'm so sorry," I said. "Are you alright? Totally my fault."

Surveying the damage to his rear bumper, visibly annoyed, he said, "You were texting or something?"

"No, absolutely not. Actually...and you're probably not going to believe this, but it was the car's fault! This is a brand, new car, and I was using this Auto Pilot feature, and it was working perfectly fine, but then it just stopped working."

He looked at me blankly, and seemed to be mulling a response.

"Don't worry," I said. "I take full responsibility and I'll cover all your expenses. I'm so sorry about this."

"I'm calling my cousin," he said, getting back into his car to escape the cold.

"I'm going to warn people about this problem," I muttered to him, a lame defense.

I waited outside his car for a few minutes until he cracked his window. "I'm calling the cops," he said.

I sulked back to my vehicle, trying not to look at the faces of the drivers gaping at the new Tesla that had rammed into an innocent victim. But I couldn't help it, and locked eyes with a guy who looked downright gleeful, the sight of my misfortune apparently triggering a dopamine response of his own.
 