Welcome to Tesla Motors Club

Autopilot False Positive Braking

Up until yesterday, I'd read the 'Autopilot caused accident' threads and news stories with a grain of salt. I always assumed they came from Tesla shorters or from people who weren't well educated on Autopilot and its capabilities.

However, yesterday, when I was cruising in stop-and-go traffic on Autopilot 1 in an 85D, the car suddenly slammed on the brakes just as it had started accelerating because traffic ahead was starting to move. It had no reason to brake: no one cutting it off, no stationary objects, and it wasn't a turn, the road was straight. Because it slammed the brakes like an emergency stop, a minivan changing lanes behind me, trying to take the spot I was leaving, dented the right rear end of the car. How could I have been supervising Autopilot in this incident? As soon as I noticed it was slamming on the brakes I stomped on the accelerator; that's how I got away with just a dent instead of a full bumper/fender replacement.

I understand it failing to take a turn properly or not detecting a stationary object in time. That is why we are still responsible while on Autopilot. However, slamming the brakes hard in stop-and-go traffic, reducing my time to correct Autopilot to virtually zero, is something I haven't seen anyone report before.

What do you think? Am I missing something?
 
If you were accelerating when traffic ahead was stopped and then started to move, from what I've been reading I think it suddenly saw there was traffic ahead and took action so you didn't slam into it. It doesn't register non-moving objects, or am I wrong in thinking that?
 
As long as we don't have perfect AI/software that can fully understand the radar and camera data, there will always be a balance between false positives and situations AP doesn't respond to.
I have never experienced what you describe myself.
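The tradeoff described above can be sketched as a single detection-confidence threshold on a braking decision. This is a hypothetical illustration, not how Tesla's stack actually works: the function names and all the numbers are invented. Raising the threshold cuts phantom braking (false positives) but misses more real obstacles (false negatives); lowering it does the reverse.

```python
def should_brake(detection_confidence: float, threshold: float) -> bool:
    """Brake whenever the perception stack's confidence that an
    obstacle is ahead exceeds the chosen threshold."""
    return detection_confidence >= threshold

# Simulated frames: (confidence reported by perception, obstacle actually there?)
frames = [
    (0.95, True),   # real stopped car, clearly seen
    (0.70, False),  # shadow or patched pavement misread as an object
    (0.55, True),   # partially occluded obstacle
    (0.20, False),  # clear road
]

for threshold in (0.5, 0.8):
    false_positives = sum(1 for c, real in frames
                          if should_brake(c, threshold) and not real)
    false_negatives = sum(1 for c, real in frames
                          if not should_brake(c, threshold) and real)
    print(f"threshold={threshold}: phantom brakes={false_positives}, "
          f"missed obstacles={false_negatives}")
```

With the low threshold the sketch produces one phantom brake and no misses; with the high threshold, no phantom brakes but one missed obstacle. No single threshold eliminates both error types at once, which is the balance the post refers to.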
 
If you were accelerating when traffic ahead was stopped and then started to move, from what I've been reading I think it suddenly saw there was traffic ahead and took action so you didn't slam into it. It doesn't register non-moving objects, or am I wrong in thinking that?
If it didn’t register stationary objects, Autopilot would not brake and bring the car to zero when traffic is at a standstill on the highway.
 
That wasn't AEB that activated.

It was a false braking event with AP.

In my experience with TACC, I've had minor false braking events that luckily happened at speeds where I was able to correct, so I only looked a bit ridiculous. I still use it because the convenience is worth the rare risk.

Plus this type of event is not limited to Tesla.

An AP1 car is built on Mobileye technology, which is used in a lot of cars. My 70D has it, and Mobileye is one of the most proven systems. But it's not foolproof.

In your case it really came down to bad luck. The same thing could have happened had you accidentally slipped the clutch in an old car while taking off.

Insurance-wise, I believe it's the responsibility of the person behind you not to hit you. The minivan driver had no idea what you were stopping for, but they still need to leave enough room to stop safely.
 
The software appears to be overly conservative - initiating braking when it believes there MAY be an object in front.

We've got a stretch of surface street near our house that has a patched section of pavement with a different shade - and every time we drive over that section with TACC enabled, the car slows down.

I've also noticed times on highways when the software briefly starts slowing down, preparing for harder braking - and then decides everything is OK and resumes the set speed.

This was much worse last summer - with unnecessary braking at times when driving under bridges or large road signs - that's improved in recent releases.

As for detecting fixed objects ahead, like a large fire truck: if Tesla really believes the AP2.x sensors are enough to support FSD, then failing to detect a fire truck is a short-term issue with the current software, and eventually the software will be improved enough to recognize a fixed object in the lane ahead and take action in time to avoid hitting it.

Until Tesla has confirmed they have AP2 working correctly, drivers need to be careful when operating TACC and AP2/AutoSteer - the software will not always operate as expected.
 
Not all stop-and-go traffic is alike. There's the more typical pattern: go 15-25 mph for a short distance, stop, rinse, repeat. And then there's a trickier one at a higher speed (typically around twice the aforementioned). I'm not sure which one you encountered here. You also mentioned that the offending car made a lane switch, which in stop-and-go traffic implies a rather aggressive driver. Regardless of the scenario, if a driver simply keeps a safe distance from the car in front, they should be able to brake safely no matter what that car does, the overwhelming majority of the time (barring some extreme cases). Unfortunately, there are a lot of bad drivers out there who don't do so. I try to overcome this by not using AP when I see certain traffic conditions or such a driver behind me, or by taking over braking early and gently.

I think we all know that AP1/2 have their limitations, but certain ones (such as any phantom braking) are even more problematic than one may realize once we factor in the other conditions out there (e.g. bad drivers). Avoiding accidents is already hard enough when we're the ones fully driving, let alone for a computer with said limitations, which is all the more reason to be judicious about when and where to use it. If you are already judicious in your use of AP, then this incident may come down to you simply encountering a bad driver, or to one of those extreme cases. At least your Tesla came away with only relatively minor damage and you seem to be fine, which is the most important thing.
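The "safe distance" rule above can be put in rough numbers with the standard stopping-distance formula d = v·t_react + v²/(2a). The reaction time and deceleration below are generic textbook assumptions, not Tesla specifications, and the function name is invented for illustration.

```python
def safe_gap_m(speed_mps: float,
               reaction_s: float = 1.5,
               decel_mps2: float = 7.0) -> float:
    """Gap needed to stop without contact even if the car ahead
    stops dead: distance covered during the reaction time plus
    the braking distance v^2 / (2a)."""
    return speed_mps * reaction_s + speed_mps**2 / (2 * decel_mps2)

# At ~25 mph (11.2 m/s), the upper end of typical stop-and-go traffic:
print(round(safe_gap_m(11.2), 1))  # -> 25.8 (meters)
```

Even at stop-and-go speeds, the required gap is several car lengths, which is why a vehicle that cuts in directly behind someone, as in the OP's incident, has little margin if the lead car brakes hard for any reason.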
 
I've had these false braking events too. It shouldn't be caused by the radar, because I never see this while on TACC. Only on autosteer does this problem occur.

It's not a stop-and-go traffic only problem. It gives you quite a jolt when this happens on the highway. Definitely not safe.

(I should note that I do love autopilot, and use it on all of my longer drives. But I'm not blind to its flaws. Hopefully talking about this more will increase awareness of these problems, get them fixed sooner, and make everyone a safer driver/user.)
 
The software appears to be overly conservative - initiating braking when it believes there MAY be an object in front.

But... beware of the fire truck exception to this rule. Teslas love fire trucks.

 
You also mentioned that the offending car made a lane switch, which in stop-and-go traffic implies a rather aggressive driver. Regardless of the scenario, if a driver simply keeps a safe distance from the car in front, they should be able to brake safely no matter what that car does, the overwhelming majority of the time (barring some extreme cases). Unfortunately, there are a lot of bad drivers out there who don't do so. I try to overcome this by not using AP when I see certain traffic conditions or such a driver behind me, or by taking over braking early and gently.
Let’s not be so quick to blame this on the other driver. While legally they are likely at fault, in the real world I don’t think they could have anticipated the Tesla’s actions here. They saw the Tesla accelerating with clear space ahead (I say this because the OP said he mashed the accelerator to recover, and he wouldn’t have done so if there were vehicles ahead) and began to move into the now-open lane. Who here honestly wouldn’t have done that? I know that if I mashed my brakes in any kind of urban traffic for no reason, nine times out of ten I’m going to get rear-ended.

The other tragedy here is that the driver of the minivan may also be a victim of Tesla’s never-ending AP beta, despite not signing up for it.
 
Because Tesla hasn't officially confirmed what EAP features are actually working - we are all guessing.

It's possible EAP is only operating with the front camera(s) and radar today - and that, at best, it's only looking at the moving vehicle(s) immediately in front and basing decisions on the changing spacing between vehicles.

Human drivers use more information to make driving decisions - such as emergency vehicle sirens and brake lights - inputs that are probably not being used by AP2 software. A human driver will see traffic slowing down ahead - with brake lights from many vehicles, and be prepared to slow down and stop. If AP2 is just looking at the vehicles immediately ahead and ignoring the brake lights, the software is likely to react later than a human driver.

From a safety perspective, it would help considerably if Tesla could tell us what is and isn't working in EAP - so the driver is prepared to handle situations not currently supported in the software.
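The contrast drawn above, between a system that watches only the gap to the lead vehicle and a human who also reads brake lights further up the queue, can be sketched as two simple decision rules. Everything here is hypothetical: the function names, thresholds, and the binary "brake lights ahead" cue are invented for illustration, not a description of how AP2 actually works.

```python
def decel_gap_only(gap_m: float, closing_speed_mps: float) -> bool:
    # React only once the gap to the lead car is closing and already short.
    return closing_speed_mps > 0 and gap_m < 20.0

def decel_with_cues(gap_m: float, closing_speed_mps: float,
                    brake_lights_ahead: bool) -> bool:
    # Same rule, but brake lights further up the queue trigger an
    # earlier, gentler response, the way a human driver anticipates.
    return brake_lights_ahead or decel_gap_only(gap_m, closing_speed_mps)

# Traffic ahead has just started braking: the gap is still comfortable,
# but brake lights several cars up are already visible.
print(decel_gap_only(35.0, 1.0))         # -> False: waits until the gap shrinks
print(decel_with_cues(35.0, 1.0, True))  # -> True: slows early on the cue
```

The gap-only rule is forced to react late and therefore hard, which matches the post's point that software ignoring brake lights will tend to brake later, and more abruptly, than a human driver would.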
 
I have experienced many TACC/AEB slow-downs/alerts that I would consider false positives. In most situations, the false positive was understandable such as when the road curves and the car confuses lane occupancy/position of the cars ahead.

At least once, the car locked onto (utility/construction?) markings on a gently-rising road, and warned of collision - warning came on, then the locked image flashed off, and then came back on again, to be turned off again as the car approached the markings (flatter against the road because the road's slope had tapered off). Of course I was startled at the time. But I am not surprised by such limitations, knowing how evolution has fine-tuned our hardware/software capabilities to handle such situations.

Computer systems are still quite primitive - but also improving fast. Hopefully, they will surpass us soon.
 
Let’s not be so quick to blame this on the other driver. While legally they are likely at fault, in the real world I don’t think they could have anticipated the Tesla’s actions here. They saw the Tesla accelerating with clear space ahead (I say this because the OP said he mashed the accelerator to recover, and he wouldn’t have done so if there were vehicles ahead) and began to move into the now-open lane. Who here honestly wouldn’t have done that? I know that if I mashed my brakes in any kind of urban traffic for no reason, nine times out of ten I’m going to get rear-ended.

The other tragedy here is that the driver of the minivan may also be a victim of Tesla’s never-ending AP beta, despite not signing up for it.

You should leave enough space getting in behind someone that if they suddenly came to a complete stop, you’d be able to stop in time without hitting them. If you want to take a short cut and get in right behind someone like this, then you accept the risk presented by doing so. The fact that lots of other people might short cut the process also doesn’t absolve you of responsibility.
 
I have experienced many TACC/AEB slow-downs/alerts that I would consider false positives. In most situations, the false positive was understandable such as when the road curves and the car confuses lane occupancy/position of the cars ahead.

At least once, the car locked onto (utility/construction?) markings on a gently-rising road, and warned of collision - warning came on, then the locked image flashed off, and then came back on again, to be turned off again as the car approached the markings (flatter against the road because the road's slope had tapered off). Of course I was startled at the time. But I am not surprised by such limitations, knowing how evolution has fine-tuned our hardware/software capabilities to handle such situations.

Computer systems are still quite primitive - but also improving fast. Hopefully, they will surpass us soon.

I’m still baffled by the occasional (once every few months) chiming of the early collision warning *sound* with no visible alert or apparent reason for going off.
 
You should leave enough space getting in behind someone that if they suddenly came to a complete stop, you’d be able to stop in time without hitting them. If you want to take a short cut and get in right behind someone like this, then you accept the risk presented by doing so. The fact that lots of other people might short cut the process also doesn’t absolve you of responsibility.
Sounds good in a textbook. I’m pretty sure that if I stand on my brakes for no apparent reason in urban traffic, I can get the car behind me to hit me 9 times out of 10.