
Another rear-end accident on AP

Sure would like to get a little incite from Tesla on this. I've had a few cars, including the M3, that basically have TACC, and in all of them, including the M3, it seems quite reliable. I regularly drive a 60 mph road that has stop lights, so sometimes cars will be stopped at a light up ahead while I'm running 60 mph, and the car always comes to a nice, orderly stop. Having both cameras and lidar up front, I've never even worried about it.

I would really like to understand the circumstances around these accidents and better understand causation. All we seem to get is the headline. I would feel better if they could point to a hardware failure or something that's not just random.
 
I am only a reservation holder, but as I remember it, during my test drive, when I applied force to the accelerator, Autopilot/Autosteer/TACC did not disengage, but I received a warning that AP/AEB would not engage while I had my foot on the accelerator.

In that instance, you would still be on "autopilot" but would plow into anyone who was stopped in front of you, correct?
 

I'm hoping that something like that is what happened; I would just like to know what the investigation turns up. I'm pretty sure the car's computer would tell the whole story.
 

Eww, the infamous disclaimer. If that's really the case, then TACC is utterly unreliable and truly dangerous. TACC is hardly new tech; many cars offer similar options, and I've owned (and still own) some of them. I've never had an issue with any of them, including the Tesla, and the Tesla arguably has better tech than the others, with three forward-facing cameras and lidar. It's supposed to destress the drive, but that would hardly be the case if you can't trust it not to plow into the car in front of you. My car seems to be quite reliable, but I'd sure like to know if there are some kind of mitigating circumstances that I should be aware of.
 
You keep referring to LIDAR. I'm pretty certain you mean RADAR. Every Model 3 has a RADAR unit. On the other hand, Elon says that LIDAR is a crutch and he will never allow it on his cars (or words to that effect).

RADAR: RAdio Detection And Ranging
LIDAR: LIght Detection And Ranging
 

I am sure that disclaimer is for legal purposes. I just did a short road trip with Autopilot and it did great, even in several stop-and-go situations. With that being said, I was always aware, alert, and ready to slam on the brakes. The car brakes harder than I prefer, but that is because it keeps a set distance, whereas when I'm coming up on stopped traffic, I quickly brake hard to alert the car behind me and then coast up close to the car in front of me.
 
If I were a betting man, I'd wager this will vanish from the news after Tesla shows that the operator's foot was on the accelerator (overriding AP).

I still try my best to refer back to the famous Abe Lincoln quote: “don’t always believe what you read on the internet, most of it is nonsense.”

It’s pretty evident that “fake news” is a real thing, as every news agency seeks a bigger headline to win those advertiser dollars, at times not doing the due diligence to make sure what they are reporting is accurate. A mix of incompetence and bias is a dangerous thing.

Just my .02
 

It's pretty easy to say "Autopilot failed" when really you hit the steering wheel while trying to pet your dog in the back seat. Elon probably has a video of that; we all know that camera is really active.
 
I have a feeling I will not use Autopilot much. I'm only speaking for myself, but if Autopilot is primarily doing the driving, it will be hard to maintain my own alertness and attentiveness indefinitely, especially as I become more "trusting" of it. It seems like it would be easy to get too comfortable. The other scenario would be me being paranoid or stressed enough about Autopilot making an error that I would prefer to operate the car manually.
 
Until we get to the point where cars are L5 autonomous and there is no manual override (i.e., no pedals and no wheel), the car will always defer to user input. If you stand on the accelerator while there is a brick wall directly in front of you, the car is designed to presume that you know what you're doing and let you drive right into it. This presumption may in many cases be objectively wrong (see: all the "unintended sudden acceleration" incidents where the telemetry later shows the panicked Mk. I Meatbag in the chair was commanding full acceleration with their foot on the long pedal on the right).

PICNIC (abbrev.): "Problem In Chair; Not In Computer"
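Purely as an illustration of that "driver input wins" priority (a toy Python sketch with made-up names and numbers, not Tesla's actual code):

# Toy illustration only, NOT Tesla's implementation.
# Shows the priority order described above: driver accelerator input
# overrides automatic braking, which overrides the cruise controller.
def resolve_accel_command(driver_accel_pct, tacc_accel_pct, aeb_brake_requested):
    """Return commanded acceleration in percent; negative means braking."""
    if driver_accel_pct > 0:
        # Driver is pressing the accelerator: the system presumes the human
        # knows best, so automatic braking is suppressed and the driver's
        # request passes through (the warning from the test drive above).
        return driver_accel_pct
    if aeb_brake_requested:
        # No driver override: automatic emergency braking takes priority.
        return -100.0
    # Otherwise follow the traffic-aware cruise controller's request.
    return tacc_accel_pct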
 
Usually it's all about how the stopped car is positioned. If it's stopped partly off to the left, sometimes it doesn't get detected (and of course most of the problems you see in the news involve a firetruck, a van stopped halfway in the road, a police car, etc.). If a vehicle is stopped not directly in front of you but off to the left or right, pay attention.
Usually it works great, but again, everything is fine as long as you just watch and stay ready; if the car doesn't slow down, you have plenty of time to correct.
 
The mitigating circumstances are right in the disclaimer! It's very hard for radar-based cruise control systems to recognize stopped cars. I guarantee that every other car you've owned has the exact same limitation. Just because your car recognizes stopped cars 999 times in a row does not mean it will do so the 1,000th time.
 
Just picking nits here, but I believe you meant to say "insight", and there is no LIDAR in your car.
 

If you are only a reservation holder and went on one test drive, maybe you should "disagree" less and listen more to those who have years of experience.

Here is everything anyone needs to know about "not being on the news in your Tesla".

* YOU are the insured party, not Tesla and not anyone else, for a 5,500 lb vehicle that can exceed 120 mph.
* A 5,500 lb vehicle at any speed poses a threat to life, limb, and property.
* Humans are better than AP. Humans + AP are better than humans alone or AP alone.
* Always be aware of the roadway 10-15 seconds ahead.
* Always allow a space cushion in case of unpredictable events.
* AP has to be managed. Not micromanaged necessarily, but managed.
 
"One of the common misimpressions is that when there is, say, a serious accident on Autopilot, people – or some of the articles – for some reason think that it’s because the driver thought the car was fully autonomous and it wasn’t, and we somehow misled them into thinking it was fully autonomous. It is the opposite.

When there is a serious accident, it’s in fact almost always, maybe always, the case that it is an experienced user and the issue is more one of complacency. They get too used to it. That tends to be more of an issue. It is not a lack of understanding of what Autopilot can do. It’s actually thinking they know more about Autopilot than they do, like quite a significant understanding of it."
Elon Musk