Overly Dramatic Braking

Since taking delivery of my M3 last month, I've had it brake automatically a couple of times when, in my view, the force was over the top. I don't mean 'phantom braking', because I could see the reason why; I just don't appreciate the ferocity.

Today I was on cruise control at 50mph when a truck crossed the road in front of me at least 150m away. I could see that, while he was a bit late in his manoeuvre, he would be well clear by the time I reached the junction. The M3, on the other hand, was having none of it. It didn't quite slam on the brakes, but not far from it, and certainly enough to irritate the driver behind me... it just looked and felt like an exaggerated move for no reason. The distance setting on my car is medium, so I don't think that's what is causing the issue.

Probably 20 minutes later, still on cruise control, the M3 again braked hard, apparently because it thought a large truck coming the other way was going to swerve into me. We were well apart and I certainly saw no imminent danger. Yet again, the driver behind me (a different one, thankfully :)) must have thought I was brake testing him... it just looked and felt stupid.

Now, unless it's just my car doing this, it seems to me that the auto braking is, in some instances, 'all in' when it actually needs to be more progressive. I can understand the difficulty in programming that, but until it's available it's less stressful for me to just drive without any aids turned on. Of course, it may just be my car...
 
If we're honest with ourselves, these features just aren't very good currently. There's nothing more to it.

That's really why I asked whether it's just my car, because looking at YouTube, for instance, the impression is that everything is pretty much on track for FSD any time soon. I have to believe everything will improve, but I think the timescale is longer than many would have us believe. To take Masklin's point, ten years ago I had an XKR with auto cruise control and that was considerably better than the current system in my Tesla. Having said all that, the car has many positives and is a pleasure to drive - manually.
 
TACC tries to be too clever and, as a result, there are a number of scenarios where it doesn't perform as well as simpler cruise or adaptive cruise control. You do get to know the scenarios where it works well, or doesn't, and learn to drive defensively. It is worth revisiting the manual once in a while (and toggling the feature off and on on screen so the warnings are shown again) to see how things change from time to time, and to put what the manual says in the context of your own experience.

It does sound, however, like at least one of your scenarios was outside the recommended use case, but it's all too easy to take prior experience from other systems and assume TACC will perform equally well in all of those cases. If you follow Tesla's guidance to the letter, it's a function for motorways and dual-carriageway A roads, in the dry. The reality is better than that, mostly.

Unfortunately, there are a number of things Tesla does not do well that other manufacturers have managed for years. Tesla's aim is full self-driving, so more basic (yet reliable) features don't seem to be a consideration for them, even though the route to FSD is taking much longer than originally promised.

That said, do not confuse the potential of FSD with its current behaviour on urban and winding roads. The only part of FSD that is currently implemented to near its full potential in the UK is NoA - driving on city streets and winding roads is still an upcoming FSD feature, yet the car still manages it with a fair amount of success on regular AP. I think those capabilities (and the large number of videos showing them being used in those scenarios) lull people into the false impression that it should be able to do it perfectly now, when the reality is that it's still not officially there yet and, in some cases, still a very long way off.

A Tesla is a constantly evolving car which, unlike most (all?) mainstream cars, has the ability to get better, or worse, over time. Some of us enjoy seeing what changes on a month-by-month basis, but I can see that others just want a car they can get in and drive, without worrying about how it is going to misbehave on a particular trip.
 
In the Elysian future world of FSD I guess we will usually still have to tell the car where we want to go. Presumably the preferred method of doing this would be voice. Perhaps getting this working would be a good next step for Tesla. I only have 3 navigation favourites programmed at present, one of which is “Mum’s”. Shouldn’t be too hard to offer this as a first-choice destination rather than “Mom’s cookhouse” in Southern Spain or similar - should it? The phone doesn't seem to prioritise offering 'favourites' either.
 
The problem is that humans are allowed to fail, but a car's AP is not.

If the truck is still in your lane and slams on its brakes while your AP was assuming it would carry on out of your lane, there will be big headlines in the news about a failing Tesla AP.

Therefore the AP makes decisions based on existing facts, not on assumptions.

This is because we all know what they say about assumptions.
 
Therefore the AP makes decisions based on existing facts, not on assumptions.

True of Tesla's current approach, but unfortunately there's no sign that it tracks and predicts certain scenarios the way some other autonomous solutions do. A car 150 yards away crossing the carriageway shouldn't trigger a hard brake: a human (and other solutions) would work out that by the time you get there the danger will have passed, and if it wasn't going to pass, the car would still have 150 yards over which to continuously reassess the situation and apply the brakes to bring itself safely to a stop.
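To illustrate what I mean by continuous reassessment, here's a minimal Python sketch. It's purely illustrative and nothing like Tesla's actual code; the function name, the 2-second margin and the example numbers are all just assumptions for the sake of the example.

```python
# Minimal sketch (not Tesla's actual logic) of the "keep reassessing" idea:
# brake only if the crossing vehicle won't be clear by the time we arrive,
# re-evaluated every control cycle as both vehicles move.

def needs_braking(own_speed_mps: float,
                  distance_to_junction_m: float,
                  crossing_time_to_clear_s: float,
                  safety_margin_s: float = 2.0) -> bool:
    """Return True only if the junction won't be clear before we reach it."""
    if own_speed_mps <= 0:
        return False
    time_to_arrival_s = distance_to_junction_m / own_speed_mps
    return time_to_arrival_s < crossing_time_to_clear_s + safety_margin_s

# Example: 50 mph (~22.4 m/s), truck crossing ~150 m ahead and clear in ~3 s.
# Time to arrival is ~6.7 s, well beyond 3 s plus the 2 s margin, so no hard
# brake is needed yet; the check simply runs again next cycle with fresh data.
print(needs_braking(22.4, 150.0, 3.0))   # False
```

The point isn't the numbers, it's that the decision gets re-made every fraction of a second, so the car never has to commit to a violent stop 150 yards early.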
 
Braked for a really scary shadow in the road today. It was cast by a tree that was properly on its side of the edge-of-carriageway line.
About a mile further on, the L/H pillar camera reported that it was dirty or blinded. Definitely not the former! Wrong sort of sun on the camera!
 
As a prospective owner, can I ask, is it only when Autopilot mode is engaged that the car exhibits these quirky behaviour traits?

Yes and no.

Most phantom braking events occur when on TACC or AP.

But the car does have other safety features (as do many other cars on the road), such as emergency brake assist or lane departure avoidance where the car can also take what it thinks is evasive action or 'scream at you'. Some of these can either be turned off to some degree, or adjusted for sensitivity.

Of the OP's incidents, the first is likely to be TACC related, but the second may well have been a safety feature, possibly emergency brake assist ('automatic emergency braking') triggered by an uncorrected forward collision warning event.
 
Now, unless it's just my car doing this, it seems to me that the auto braking is, in some instances, 'all in' when it actually needs to be more progressive. I can understand the difficulty in programming that, but until it's available it's less stressful for me to just drive without any aids turned on. Of course, it may just be my car...

This is of course a really tricky problem. We all want the car to brake automatically in a true emergency, but not when we (as humans) can see it isn't one. Assuming the car has to assess the situation probabilistically (likely), how does Tesla "tune" the emergency response?

The answer has to be that they tune it to bias toward false positives. That is, to occasionally brake when it's not necessary in order to ensure it will always brake when it has to. This is done for three reasons: (a) to save lives (duh!), (b) to avoid lawsuits, (c) to avoid bad publicity.

But as you note, the braking response, even allowing for a conservative emergency assessment, does seem overly aggressive. These systems will improve, I'm sure. In the meantime, I'd rather put up with that, knowing that one day that over-aggressiveness may save my life or someone else's.
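To put that false-positive bias in concrete (if simplified) terms, here's a rough Python sketch. The collision-probability input, the threshold values and the function name are all made up for illustration; nothing here reflects Tesla's actual parameters or implementation.

```python
# A minimal sketch, assuming the car produces some collision-probability
# estimate each cycle. Lowering the intervention threshold biases the system
# toward false positives: it brakes on some harmless events so that it
# rarely misses a real emergency.

def should_intervene(collision_probability: float,
                     threshold: float = 0.2) -> bool:
    """Brake if the estimated collision probability exceeds the threshold."""
    return collision_probability >= threshold

# With a cautious threshold of 0.2, a marginal 25% estimate triggers braking
# (possibly a false positive); raising the threshold to 0.5 would skip it,
# but would also risk skipping genuine borderline emergencies.
print(should_intervene(0.25, threshold=0.2))  # True  (conservative tuning)
print(should_intervene(0.25, threshold=0.5))  # False (more misses risked)
```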
 
In the meantime, I'd rather put up with that, knowing that one day that over-aggressiveness may save my life or someone else's.

I'm with you on that, right up to the point where the braking is so aggressively over the top that it affects following drivers, who either a) can't work out why you're driving so strangely or b) think you're brake testing them.

Of the OP's incidents, the first is likely to be TACC related, but the second may well have been a safety feature, possibly emergency brake assist ('automatic emergency braking') triggered by an uncorrected forward collision warning event.

I accept the possibility, but there was no way this was an uncorrected event. The truck was fully in its lane coming towards me, and I was in mine. The problem is that on narrower two-way roads the Tesla will 'shy away' from large vehicles coming the other way or, as in my case, brake violently.
The braking wouldn't be a problem while the cars are learning; it's the aggressive nature of almost screeching to a halt that is the issue.

To restate the point answered earlier: these issues only occur when the car is in an automatic driving mode.
 
True of Tesla's current approach, but unfortunately there's no sign that it tracks and predicts certain scenarios the way some other autonomous solutions do. A car 150 yards away crossing the carriageway shouldn't trigger a hard brake: a human (and other solutions) would work out that by the time you get there the danger will have passed, and if it wasn't going to pass, the car would still have 150 yards over which to continuously reassess the situation and apply the brakes to bring itself safely to a stop.

You don't want an AP to play chicken, because even though it often goes right when humans are steering, it affects the stock price when it goes wrong with the AP turned on. Other systems do not reassess the situation; they simply ignore it and crash if the crossing truck hasn't left the crossing by the time you pass.
 
We simply forget that humans are, on average, very bad drivers who ignore traffic rules because we think we know better. You want an AP that always follows the rules, no exceptions, but also takes into account the bad driving habits of human drivers.

When you pass a semi and it pulls into your lane because the driver is playing with his phone or simply not paying attention, you want an AP that takes evasive action, not one that gambles with your life and just hopes for the best. The fact that nine times out of ten the semi moves back into its lane in time is no reason for an AP to assume it always will, and therefore you want it to brake.