
Serious Issue with our 2021 Model 3 - Reported to TSB

We've experienced several instances of phantom regenerative braking at all times of day, on every road type, and in all weather conditions. It's progressively getting worse. These issues occur not just with Autopilot engaged but also when the car is simply on cruise control. The latest software is installed.

Last night, I was driving on a mostly empty 4-lane highway with the cruise control (not Autopilot) engaged. I pulled into the L/H lane to pass a slower-moving vehicle. After the passing manoeuvre was completed, I pulled back into the R/H lane. The car started braking heavily and I was thrown forward into the restraints. This was not a case of regenerative braking; the car was forcefully applying the brakes.

I quickly applied pressure on the accelerator to avoid being rear-ended by the car I had just passed. The cruise control indicator still showed it was engaged at a set speed of 115 km/h, but letting up on the accelerator would cause the vehicle to decelerate well below the set speed.

Disengaging the cruise control and re-engaging it returned the vehicle to the proper speed setting. However, there were several other events following this one where the cruise control disengaged on its own (but without the active braking, just regenerative braking).

I turned off cruise control for the rest of the drive home. I no longer trust this vehicle's automated driver-assist programs. I have reported this behaviour to the authorities and will be bringing the vehicle back to Tesla. I can't be the only one concerned about this.
 
We also have a 2021 Model 3, and I only use the cruise control when there are few cars around. I've found that the phantom braking is worse at night.
I wonder if Tesla will address this problem; others are saying the same thing happens to them. Would an update fix it? I don't know...
 
I've had very similar issues (AP or cruise control engaged on a relatively empty highway at night) where the car suddenly slams on the brakes for no apparent reason. Because of this, I almost never use AP or cruise control when my family's in the car, and I use it sparingly even when it's just me. Tesla really needs to fix this sh!t.
 
A problem with TMC is that they chose to separate the forums by model, which makes it harder to follow systemic problems across Tesla models. Members looking for the broader picture have to search across all of them.
In the Model Y section this topic is heavily discussed. The posts there show when the incidents started being reported, and they echo the frustration you're expressing. Drivers range from excusing it as modern technology that will have bugs to calling it a deal breaker. It appears not to be just a Tesla problem either. And of course, in typical Tesla style, someone lights a candle to Elon's image, fairy dust downloads to our Teslas, and we all live happily ever after.
 
There's no excuse for this. My wife and I could be seriously injured in a high-speed rear-end collision, or in an off-road excursion if this happens on snow-covered roads. Too bad about the forking of the forums; maybe they should start a forum section just for this issue.
 
If you are simply expecting a magical fix, don't.

You need to report it to your government's safety authority (NHTSA in the USA) each time it occurs.
 
Yup, and like the previous poster mentioned, it's good practice to keep reporting these braking issues to the authorities.
 
A year? It's been happening for longer than that. Yes, they are aware and working on it, enhancing the AI system. It's taking them longer than they anticipated.

Now, about the "on snow" thing... First, using any automated system on snow-covered roads is a bad idea; you should be driving yourself in those situations. I understand you're talking about Traffic-Aware Cruise Control, but I thought it best to specify. It is traffic-aware, not a dumb cruise control, so it may not be a good tool on snow.

Second, the car ALSO has traction control, ABS, and stability control programs. Those help make sure you don't just spin out of control. Phantom braking is a serious issue, but not that serious. Also, getting hit from behind is probably less injury-prone than many of the other accidents the system is trying to avoid, like head-on collisions. These events are false positives, I know, but the system is trying to avoid more fatal outcomes.
 
I appreciate your approach to this and agree that driver-assist programs have a time and place where they shouldn't be used. However, I disagree that it's not a serious issue. In any other vehicle, random, unpredictable braking for no apparent reason would be grounds for a massive recall.
 
Never said it wasn't serious, and I personally hate it when it happens too; I am eagerly waiting for a good resolution. I said it was not as life-threatening as some make it out to be. First, getting hit from behind is probably the least dangerous way to get hit. Second, you can override with the accelerator. Third, the law generally puts the driver who hits you from behind at fault, since they're supposed to leave enough distance to react if you hit the brakes for whatever reason.
I would also argue that phantom braking is not braking "for no apparent reason". There is an apparent reason: the AI is seeing something and deciding to take action. You might be close to the line, an oncoming vehicle might be positioned as if it could cross toward you, etc. Every time it has happened to me, I could see the potential reason for it. The situation didn't actually require braking, hence the false positive, but I could understand why it did what it did.

I still want this fixed, and I'm fine with everyone reporting it as an issue, because it is an issue.

EDIT: Ah, this makes me think of Autopilot, and even FSD. You're supposed to stay mindful, with hands on the wheel, ready to react at any moment. It's an assist system, not a replacement or a fully automated system. I guess TACC is the same: we must constantly pay attention and override when needed. It's a pain, I know; a fully automated and trustworthy system would be better.
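To illustrate the trade-off being described here, a purely hypothetical sketch follows; none of the names, scores, or thresholds below come from Tesla. They are assumptions about how a perception-driven braking decision could be structured, and why a threshold tuned to catch real hazards will also fire on harmless detections:

```python
# Hypothetical illustration only -- not Tesla's actual logic.
# Assume a perception stack assigns each detected object a "collision risk"
# score and the planner brakes when any score crosses a threshold. A lower
# threshold catches more real hazards but also produces more false positives
# (phantom braking); a higher threshold does the opposite.

from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # e.g. "oncoming_truck", "overpass_shadow"
    risk_score: float   # 0.0 (harmless) .. 1.0 (certain collision)

BRAKE_THRESHOLD = 0.6   # made-up number; tuning it is the whole trade-off

def should_brake(detections: list[Detection]) -> bool:
    """Brake if any detection's estimated risk exceeds the threshold."""
    return any(d.risk_score >= BRAKE_THRESHOLD for d in detections)

# An overpass shadow mis-scored at 0.7 triggers braking even though a human
# driver would see no hazard -- a false positive, i.e. a phantom braking event.
frame = [Detection("overpass_shadow", 0.7), Detection("parked_car", 0.2)]
print(should_brake(frame))  # True
```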
 
Indeed, AI and ML are, as of yet, no substitute for human capabilities. When I said no "apparent reason", I meant no reason apparent to me as a human driver trying to anticipate and understand why the braking is happening. The software obviously knows why it's braking (I hope), but that's of little consolation to us fragile bags of bone and water. I'm hopeful that this will be fixed too, but the more it gets reported to the authorities, the more likely it is that something will happen.
 

Just being pedantic... this is unrelated to "regenerative" braking. It's just phantom braking, and is likely engaging the disc brakes if the vehicle deems it necessary to brake for an emergency.

Also, regarding "simply on cruise control"... it *is* traffic-aware cruise control. This means the car slows down and speeds up to keep pace with surrounding traffic, and slams on the brakes to avoid a collision when necessary.

That's where the problem lies: the car isn't a good driver. It has no real intelligence, only "artificial" intelligence. It's a handful of cameras and sensors run by "beta" software. This isn't a Tesla-specific problem; "self-driving" is incredibly difficult. IMO, only Waymo has shown any real commercially viable progress, and they're using LiDAR, had a major head start, and are still geofencing their vehicles.

I understand your complaints, since Tesla offers the feature (maybe they shouldn't), but my recommendation is to simply not use it. Just drive the car the good old-fashioned way. Of all the cars out there, a Tesla is one of the most fun to drive.
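As a rough picture of what "traffic-aware" means in practice, here is a minimal, hypothetical sketch; the function name, the 2-second gap, and the numbers are all assumptions for illustration, not anything from Tesla. The point is that the controller follows whichever constraint is most restrictive, so a single bogus "lead vehicle" detection is enough to drag the target speed down hard:

```python
# Hypothetical traffic-aware cruise control sketch -- not Tesla's code.
# Two goals: hold the driver's set speed, and keep at least a minimum time gap
# to the lead vehicle. Whichever is more restrictive wins.

def tacc_target_speed(set_speed_kph: float,
                      lead_distance_m: float | None,
                      lead_speed_kph: float | None,
                      min_time_gap_s: float = 2.0) -> float:
    """Return the speed the car should aim for in this control cycle."""
    if lead_distance_m is None or lead_speed_kph is None:
        return set_speed_kph                  # no lead vehicle: cruise at set speed

    # Speed at which the current gap would shrink to the minimum time gap.
    gap_limited_kph = (lead_distance_m / min_time_gap_s) * 3.6

    # Follow the slower of: set speed, lead vehicle speed, or gap-limited speed.
    return min(set_speed_kph, lead_speed_kph, gap_limited_kph)

# If perception falsely reports a slow "lead vehicle" 20 m ahead at 30 km/h,
# the target drops from 115 km/h to 30 km/h and the car brakes hard.
print(tacc_target_speed(115, lead_distance_m=20, lead_speed_kph=30))  # 30
```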
 
Hard to disagree about the driving experience. It's the best. However, what I'm experiencing feels exactly like what happens when you take your foot off the accelerator and regen braking engages. I'm used to that feeling, and in 90% of cases the phantom braking is the same experience. The other 10% of these incidents involve much more aggressive braking. I get, though, that the system might be in active braking mode either way, just applying differing amounts of braking force.

Wondering out loud here. Did this start getting worse when Tesla moved from radar to optical sensing?
 
Ah, if the deceleration isn't harsh, you might be experiencing something different.
- In some situations the GPS can't tell whether you're driving on the highway or on a nearby side road with a lower speed limit, and the car sometimes reduces its speed because of that. I have a specific spot close to home where this happens every time.
- When cresting a hill, just as the road flattens out again, the cameras sometimes can't see the road ahead, so the car slows down in case the road simply ends or there's an obstacle that can't be seen until it's too late. There's a drive I take sometimes with plenty of small hills, and it happens on almost every one of them.
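The first of those two cases can be pictured with a small, purely hypothetical sketch; the function, the probabilities, and the 0.9 cutoff are assumptions for illustration, not Tesla's implementation. The idea is simply that a conservative controller falls back to the lowest plausible speed limit when the map-matcher can't decide which road you're on, which the driver experiences as an unexplained slowdown:

```python
# Hypothetical sketch: picking a speed limit under ambiguous map-matching.
# Not Tesla's implementation; names and thresholds are invented for illustration.

def matched_speed_limit(candidate_roads: list[dict]) -> float:
    """Return a speed limit given map-matching candidates.

    Each candidate looks like {"name": ..., "probability": ..., "limit_kph": ...}.
    If one road is a clear winner, use its limit; otherwise fall back to the
    lowest limit among the plausible candidates (the conservative choice).
    """
    best = max(candidate_roads, key=lambda r: r["probability"])
    if best["probability"] >= 0.9:
        return best["limit_kph"]
    plausible = [r for r in candidate_roads if r["probability"] >= 0.2]
    return min(r["limit_kph"] for r in plausible)

# A highway and a frontage road running side by side: neither is a clear winner,
# so the lower limit is used and the car slows down on the highway.
roads = [
    {"name": "Hwy mainline", "probability": 0.55, "limit_kph": 110},
    {"name": "Frontage Rd",  "probability": 0.45, "limit_kph": 60},
]
print(matched_speed_limit(roads))  # 60
```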
 
Tesla is well aware of this issue, and working like crazy to get it fixed. Hopefully they will have a solution that only requires an OTA software download.
They've known about it for multiple years; even back in 2019 when I reported it, they said "it's normal". And I'm sure it was happening and being reported well before I got mine in 2019.