Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Another rear-end accident on AP

"One of the common misimpressions is that when there is, say, a serious accident on Autopilot, people – or some of the articles – for some reason think that it’s because the driver thought the car was fully autonomous and it wasn’t, and we somehow misled them into thinking it was fully autonomous. It is the opposite.

When there is a serious accident, it’s in fact almost always, maybe always, the case that it is an experienced user and the issue is more one of complacency. They get too used to it. That tends to be more of an issue. It is not a lack of understanding of what Autopilot can do. It’s actually thinking they know more about Autopilot than they do, like quite a significant understanding of it."
Elon Musk

How is this even an excuse, unless you literally wrecked in your first 30 seconds of enabling AP?

This is just one representative scenario:

Red light ahead
Cars stopped
Road Limit 50MPH
Tesla moving at 55MPH
AP is engaged
Calculated stopping distance is 350 feet

400 feet away, AP has not initiated slowdown.

What to do?!?!!?!

TAP THE BRAKE AND DISENGAGE AP FFS.

Monitor whether regen can bring the car to a complete stop, or assist with the friction brakes.
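The arithmetic behind the scenario above is just the constant-deceleration braking formula, d = v² / (2a). A quick sketch (my own illustration, not anything from Tesla; the 0.3 g figure is an assumed comfortable deceleration):

```python
def braking_distance_ft(speed_mph: float, decel_g: float = 0.3) -> float:
    """Braking distance at constant deceleration: d = v^2 / (2*a).

    decel_g is the assumed deceleration in g's -- 0.3 g is a
    comfortable stop; a hard emergency stop is roughly 0.8-0.9 g.
    """
    G = 32.174                    # gravity, ft/s^2
    v = speed_mph * 5280 / 3600   # mph -> ft/s
    return v * v / (2 * decel_g * G)

# At 55 mph with a comfortable 0.3 g:
print(round(braking_distance_ft(55)))       # ~337 ft, close to the 350 ft above

# Same speed, hard braking at 0.9 g:
print(round(braking_distance_ft(55, 0.9)))  # ~112 ft
```

Note this ignores reaction time; at 55 mph you cover another ~80 ft per second before braking even begins, which is why 400 ft with no slowdown started is already an "intervene now" situation.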
 
How is this even an excuse, unless you literally wrecked in your first 30 seconds of enabling AP?
I think you're misinterpreting my point. This was in response to all the autopilot "experts" who think that such an accident couldn't possibly happen on autopilot because of their personal experience (which btw is at most a few hundred thousand miles, nowhere near enough to prove the safety of anything). They're also ignoring the fact that the manual explicitly lists responding to stopped vehicles as a weak point in the system.
 
I think you're misinterpreting my point. This was in response to all the autopilot "experts" who think that such an accident couldn't possibly happen on autopilot because of their personal experience (which btw is at most a few hundred thousand miles, nowhere near enough to prove the safety of anything). They're also ignoring the fact that the manual explicitly lists responding to stopped vehicles as a weak point in the system.

Fair enough. Not everything WILL happen, but almost anything CAN happen.

Tweaks to AP are constant, and there is no detailed log of what is tweaked.

A million miles of AP experience doesn't mean anything when version 2021.36.4.2 introduces a bug that treats red lights as green lights (a hypothetical, but possible).

Until Tesla assumes 100% liability for you, the car, and third parties, it's our responsibility to stay vigilant, always.
 
Glad that I’m not alone.
You guys are correct, it says radar in our car, but the intent is the same. I believe the radar should give closure rate, something much harder to do with cameras I think, and it should not be susceptible to darkness or most weather conditions. Don't get me wrong guys, I use the hell out of all the automation, and for me its errors are generally on the conservative side, like the sometimes tentative lane changes in NOA. These headlines just catch my attention.
 
You guys are correct, it says radar in our car, but the intent is the same. I believe the radar should give closure rate, something much harder to do with cameras I think, and it should not be susceptible to darkness or most weather conditions. Don't get me wrong guys, I use the hell out of all the automation, and for me its errors are generally on the conservative side, like the sometimes tentative lane changes in NOA. These headlines just catch my attention.
The problem with stopped vehicles is that their closure rate is exactly the same as everything else that's not moving (i.e. the road, overpasses, and things close to the lane like parked cars, trees, signs, etc.). The radar doesn't have enough resolution to identify stopped vehicles with 100% reliability.
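The point above can be made concrete with a toy filter. A Doppler radar measures closure rate, so a common trick is to discard any return whose closure rate matches your own speed, since that return is "stationary world" clutter. The catch: a stopped car ahead is also stationary world. This is a hypothetical simplification I wrote to illustrate the failure mode, not Tesla's actual pipeline:

```python
from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float       # distance to the return
    closure_mps: float   # closing speed; positive = approaching us

def moving_targets(returns, ego_speed_mps, tol_mps=1.0):
    """Naive stationary-clutter filter: drop any return whose closure
    rate matches our own speed, i.e. anything not itself moving.

    A stopped car closes on us at exactly ego speed -- the same as
    overpasses, signs, and the road -- so this filter discards it.
    """
    return [r for r in returns
            if abs(r.closure_mps - ego_speed_mps) > tol_mps]

ego = 25.0  # our speed, ~55 mph in m/s
scene = [
    RadarReturn(range_m=120.0, closure_mps=25.0),  # stopped car: filtered out!
    RadarReturn(range_m=60.0, closure_mps=5.0),    # lead car doing ~20 m/s: kept
]
print([r.range_m for r in moving_targets(scene, ego)])  # [60.0]
```

Real systems add vision, tracking over time, and map context to recover some of these targets, which is why stopped-vehicle detection mostly works but isn't guaranteed.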
 
I have a feeling I will not use autopilot much. I'm only speaking for myself, but if autopilot is primarily doing the driving, it will be hard to maintain my own alertness and attentiveness indefinitely, especially as you become more "trusting" of the autopilot. It seems like it would be easy to get too comfortable. The other scenario would be me being paranoid/stressed enough about the autopilot making an error that I would prefer to manually operate the car.
I hear that. Late one night I was traveling from the San Francisco Bay Area to Los Angeles on I-5. Autopilot was on and I was getting drowsy. I shut it off so I would have to drive, and thus (hopefully) overcome my drowsiness. Made a couple stops for Supercharging. And coffee. :)
 
Eww, the infamous disclaimer. If that's really the case, then TACC is utterly unreliable and truly dangerous. TACC is hardly new tech; many cars offer similar options, and I've owned, and do own, some of them. I've never had an issue with any of them, including the Tesla, and the Tesla has arguably better tech than the others, with three forward-facing cameras and radar. It's supposed to destress the drive, but that would hardly be the case if you can't trust it not to plow into the car in front of you. My car seems to be quite reliable, but I'd sure like to know if there are some kind of mitigating circumstances that I should be aware of.
I'd like someone to show the User Manual from another vehicle that states something to the effect of: "You can bet your life on our emergency braking system's reliability. Every time."
 
...little insight...

Please see my answer from the other thread:

Two reports of Teslas on AP hitting stopped vehicles in their lane on the freeway

...not just random...

As the manual excerpt in post #3 above points out, Autopilot collisions, injuries, and deaths are predictable, not random, if drivers don't know how to override the system in time.

Autopilot is not perfected just yet and it is up to the driver to accept that limitation and get involved in avoiding a collision.

We are still waiting for Tesla Robotaxis so that drivers don't need to be in the car but in the meantime, human driving skill is still needed.
 
Sometimes cars will be stopped at a light up ahead while I'm running 60 mph, and the car always comes to a nice orderly stop.

My car seems to be quite reliable but I'd sure like to know if there are some kind of mitigating circumstances that I should be aware of.

I am sure that is for legal purposes.

If I were a betting man, this will vanish from the news after Tesla shows that the operator's foot was on the accelerator (overriding AP).

Really? You hit the steering wheel when trying to pet your dog in the back seat.

Stop detection does work.


The fact that people are confused about how this could happen with AP is actually the most likely explanation for why it happened.

AP does not reliably stop for stopped traffic! It is that simple. (To be clear, it also does not reliably avoid collisions with moving objects.) That's why AP is not doing the driving; you are.
 
From my experience, stopped cars are ALWAYS detected even in city driving.

Again, the assumption that because it has "always" happened for you, it is ensured to always happen in general, is the most likely explanation for this accident.

Most likely, in this accident the police car was stopped somewhere between lanes, or just some portion of it was located in the Tesla's lane.

This may well have been the case here, but while it may well be a sufficient condition for failure, it is not a necessary one.

For the record, the police cruiser in the pictures was hit generally on the right-hand rear of the vehicle, but it was not an extremely high-offset collision. It looks to me like the vehicle was probably mostly in the lane, though it may well have been slightly over the left-hand line. It may well have been parked at an angle, for safety reasons.

[Photo: rear of the damaged police cruiser]
 
I'm only speaking for myself, but if autopilot is primarily doing the driving, it will be hard to maintain my own alertness and attentiveness indefinitely, especially as you become more "trusting" of the autopilot. It seems like it would be easy to get too comfortable.
My experience is different over two years and tens of thousands of miles. Autopilot handles the scut-work so I can focus on the long and wide views. This is far less tiring and safer.

You are correct: it is foolish to put blind faith in an autopilot in any context. I worked at an airline earlier in my career. One disaster occurred when the pilot and copilot, waiting for landing clearance, relied on autopilot to fly the holding pattern while they flirted with a flight attendant.

That was fine for a while. Then ice built up to the point the autopilot couldn’t offset it. The aircraft fell from the sky.

Still, there are thousands of safe flights every day. Those pilots use autopilot and maintain situational awareness. Do the same in your Tesla.
 