Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

AutoPilot Almost Killed Me!

If you get rear ended, the fault lies with the person who hit you roughly 99.99999% of the time. It’s called following too closely. Or as they teach you in defensive driving, distance is your friend.

My goal as a driver is to not get into any wrecks - regardless of fault. I don't welcome being rear ended, spending hours sorting out the insurance and whatnot, or having the car in the shop for weeks of repairs.

We purchased Enhanced Autopilot and FSD on the Model 3. Did not purchase on the Model Y. Live and learn.

We trusted Tesla that Autopilot was ready for general use. It is not - not by a long shot. We love the car, but the software isn't ready for widespread use and I'm not going to put my life at risk to be a corporation's test dummy.
 
My goal as a driver is to not get into any wrecks - regardless of fault. I don't welcome being rear ended, spending hours sorting out the insurance and whatnot, or having the car in the shop for weeks of repairs.

We purchased Enhanced Autopilot and FSD on the Model 3. Did not purchase on the Model Y. Live and learn.

We trusted Tesla that Autopilot was ready for general use. It is not - not by a long shot. We love the car, but the software isn't ready for widespread use and I'm not going to put my life at risk to be a corporation's test dummy.

You certainly have that right and the right to your opinion. I am of the opinion that Autopilot, when properly used, is absolutely ready for general use. And I think for you to suggest you are putting your life at risk by using it is being overly dramatic. Again, just my opinion. To each his own.
 
If you get rear ended, the fault lies with the person who hit you roughly 99.99999% of the time. It’s called following too closely. Or as they teach you in defensive driving, distance is your friend.

I agree. However, my family members who may be riding with me may get badly injured in a sudden, high speed rear end accident. Doesn't matter if the driver behind me is legally responsible.
 
What will be really interesting is if the car behind is also a Tesla on AP with following distance set to one car length. If the TeslaCam recording shows the car in front applying its brakes at 70 mph, it should be easy to prove in a court case that the car in front was to blame for the collision.
There would (or at least should) not be a collision in that case. If there is, it's time to sue Tesla for allowing the following distance to be set too short. Though that might be tricky, as in the end the driver is responsible.
 
What will be really interesting is if the car behind is also a Tesla on AP with following distance set to one car length. If the TeslaCam recording shows the car in front applying its brakes at 70 mph, it should be easy to prove in a court case that the car in front was to blame for the collision.

Oh that WILL be interesting. Although IMO it won’t change the outcome, and the court will still find that the person following too closely is responsible. And in this case the court may thank the driver for providing Teslacam footage that proves he was following too closely. Unless it can be proven that the driver of the lead car hit his brakes intending to cause the accident, or that he was driving recklessly, then the following car will still be at fault.

To illustrate let me give another example. Say driver A is driving on the interstate with cruise set to 75 mph and driver B is following behind at a distance of only a few car lengths. Then driver A suddenly sneezes and accidentally hits his brakes momentarily but very hard. As a result driver B rear ends driver A. But driver B has dash cam footage showing driver A sneeze and hit the brakes. The court will almost certainly find driver B at fault because had he been following at the proper distance he could have avoided the collision.
 
I agree. However, my family members who may be riding with me may get badly injured in a sudden, high speed rear end accident. Doesn't matter if the driver behind me is legally responsible.

You are correct of course and no one wants to see anyone get hurt. I suppose it comes down to one question, is there any significant or measurable increase in the likelihood of such a rear end collision (or any other accident) while using autopilot vs not using autopilot? If you believe the statistics published by Tesla the answer is no, and in fact the truth is the exact opposite (that your odds of an accident are reduced while using autopilot). As someone who has driven hundreds of thousands of miles before autopilot and another 91k miles with autopilot, my opinion is that I and my family are at least as safe and likely safer using autopilot. (I should note that I use autopilot as intended; IOW I pay attention and do not nap or read or check email or engage in other dangerous behaviors while using autopilot)
 
If the car in front of you had inoperative brake lights you are still liable if they stop and you smash them.

I am a huge fan of TACC's convenience, but in all honesty I don't use it as often as I did when I first got my Subaru. It is too cautious, and that results in people cutting in front of me. It greatly exaggerates the following distance and brakes hard in situations where the lead car is turning right into a shopping mall or something. It waits until the lead car is completely and entirely out of view before it closes the 2, 3, or 4 car-length gap you have set. A human driver can anticipate this and edge closer, even if only at slow speeds.

I like the direction we are headed in regardless. You choose to be a part of it or you don’t. The Tesla is great with automation but I don’t mind being the driver in most situations.
 
Anyone know if Elon has ever discussed adding normal cruise control? I’m sure plenty of owners would appreciate it to avoid the phantom braking.

I occasionally get phantom braking on TACC in my 2016 MS. Happens (though not frequently and not recently) at night on a particular section of two lane road near my home. Still trying to determine trigger.
Or did you mean totally stupid cruise with no radar?
 
I agree with OP. Two weeks ago I experienced this behavior on an empty freeway with no one in front of or behind me (thankfully), but if I had been followed by a large vehicle things could easily have become severe.

Add to that Tesla’s point that it’s the driver’s responsibility when these systems are engaged. At the end of the day, who’s the insurance company going to place at fault because your car randomly brake checked someone on a smoothly running freeway?

I am very leery of TACC after this experience. If I can't use TACC with confidence (or Autopilot) I'm surely not going to pay to beta test FSD and possibly be at fault because I couldn't react to my car being stupid in time. If Tesla were smart they'd offer FSD for a nominal fee or for free. They'd get more data, which is what they rely on to make FSD work.

The fact that phantom braking has been a long-standing issue makes me believe FSD won't come until well after our cars are obsolete.
Exactly! So on my Cybertruck reservation I did not select FSD. Not worth the money considering all the flaws. Perhaps in the future.
 
In 2.5 years and 18,000 miles I never, ever experienced a phantom braking event on the freeway in my Outback with Subaru's EyeSight system. That's a camera-only system too; somehow little ol' Subaru has figured it out.

Tesla is supposed to be on the cusp of FSD but they can’t figure out how to solve freeway phantom braking after all these years? Stop it.

Did it ever brake for you in an emergency?
 
...This never happens in any car with adaptive cruise control...
I have driven several cars with adaptive cruise control. Every one of them had some cases of phantom braking. FWIW, BMW, Mercedes, Jaguar and Lexus are the ones I remember. Some do more than others, it's always irritating and sometimes potentially dangerous. The invariable rule of all Autopilots and assisted driving/flying: machines are not perfect. Driver/Pilot MUST be in control. I've had a few really irritating and potentially dangerous situations, mostly while flying but some while driving. Tesla has not been worse than others but has been different. Every one I have used has had flaws, but each one is different.

It isn't the fault of the car/airplane; it's almost always the result of inattention or impairment on the part of the driver/pilot.

When Level 5 arrives it will be different. As technology improves, failures become rarer, but they still happen.

Sorry for seeming to lecture. You're blaming the car for your inattention and stating that Tesla is the only one with that issue. Not true.
 
Love my Model Y, but damn, people really crucify you for saying anything bad about Tesla. Maybe the OP is not overreacting. We weren't there. If the car behind you has to swerve to avoid hitting you, that's pretty bad.

We are on a Tesla forum, so ignore me.
I hope I don't do that. It does seem that many of the reported issues have happened with people who are not experienced with this type of technology. I wish driver training, similar to an aircraft type rating, were required to drive a Tesla. It is not an automatic transition, and ignorance produces accidents. Tesla does not help, because there is no real driver training given at purchase or afterwards. At minimum there should be something well beyond a handful of cursory videos.

I am biased because I have nearly killed myself and others when I made mistakes while flying. Others when the equipment failed. The latter are easier to deal with than the former.

This goes back to 'sudden acceleration' errors decades ago with Audis and others. Technically the driver was at fault, but redesign reduced the risk of that error. Tesla should do a better job of training and probably make some ergonomic improvements too. Reducing the probability of phantom braking is one of those things.
 
What will be really interesting if the car behind is also a Tesla on AP with following distance set to one car length...
Especially interesting because Tesla does not set following distance in car lengths but in time, since the actual distance is speed-sensitive. That information is in the manual, but few actually read it. That is another good reason why there should be mandatory driver training for highly complex vehicles of any type.

One second is nearly always much more than one car length.
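To put numbers on that, here's a back-of-envelope sketch (the 15-foot average car length is my assumption, not from the post) converting a one-second time gap into car lengths at a few speeds:

```python
# Rough arithmetic: how many "car lengths" is a one-second gap?
MPH_TO_FPS = 5280 / 3600   # 1 mph = ~1.467 ft/s
CAR_LENGTH_FT = 15         # assumed average car length

def gap_in_car_lengths(speed_mph, gap_seconds=1.0):
    """Distance covered during the time gap, expressed in car lengths."""
    distance_ft = speed_mph * MPH_TO_FPS * gap_seconds
    return distance_ft / CAR_LENGTH_FT

for speed in (30, 55, 70):
    print(speed, round(gap_in_car_lengths(speed), 1))
# 30 mph -> ~2.9 car lengths; 55 mph -> ~5.4; 70 mph -> ~6.8
```

So at highway speed a one-second gap is already six to seven car lengths, which is why a time-based setting behaves very differently from a fixed "one car length."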
 
There would (or at least should) not be a collision in that case. If there is, time to sue Tesla for allowing the following distance to be set too short. Though might be tricky as in the end the driver will be responsible.
There would be a reason if one could actually set following distance to one car length; that would be illegal in many places. Many jurisdictions have questions about following distance on the initial driver examination. Generally the rule is "one car length per ten miles per hour of speed" or something like that, which works out to roughly a one-second gap.
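A quick check (again assuming a 15-foot car length, which is my assumption) confirms that the "one car length per ten mph" rule of thumb implies roughly the same time gap at any speed:

```python
# Verify that "(speed/10) car lengths" of following distance
# implies about a one-second gap regardless of speed.
MPH_TO_FPS = 5280 / 3600   # 1 mph = ~1.467 ft/s
CAR_LENGTH_FT = 15         # assumed average car length

def rule_of_thumb_gap_seconds(speed_mph):
    """Time gap implied by (speed/10) car lengths of following distance."""
    distance_ft = (speed_mph / 10) * CAR_LENGTH_FT
    return distance_ft / (speed_mph * MPH_TO_FPS)

# The speed cancels out of the formula, so the answer is constant:
print(round(rule_of_thumb_gap_seconds(30), 2))  # ~1.02 s
print(round(rule_of_thumb_gap_seconds(70), 2))  # ~1.02 s
```

The speed term cancels, so the car-length rule and a one-second time setting are effectively the same guidance.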
 
I have an M3 and experience phantom braking. I've found that it happens when I am driving downhill approaching an overpass. My assumption is that the radar "sees" the overpass far in the distance because it is in-line with the car due to the car being uphill from the overpass. In combination with the shadow being identified by the visual camera the AI effectively sees a wall or stationary object and brakes accordingly. It is very annoying.

As of this most recent update, however, the problem has gone away. I drive the same stretch of highway about 3 times a week and no more phantom braking. There was an update a few months ago where it would gently brake... that was nicer. Then it reverted back to hard braking... not nice. And now it works perfectly... very nice.

I often wonder to myself about what safety sacrifice they had to make in order to stop the phantom braking. Regardless of what it was, it was worth it because I hate jamming on the brakes at 70 mph!
 
If you get rear ended, the fault lies with the person who hit you roughly 99.99999% of the time. It’s called following too closely. Or as they teach you in defensive driving, distance is your friend.

Clearly you've never been brake checked... and yes, ideally you can keep a safe distance, but there are times when people move over on you and then instantly brake check you, before you have time to ease back to a safe distance.