Phantom braking still an issue

I don't think it will ever be acceptable for FSD cars simply to have fewer accidents than human-driven ones. As soon as the person in the car becomes a passenger rather than a driver, there is an expectation of absolute safety. Look at the millions spent on investigation and avoidance after train or aircraft accidents. If a bus or coach turns over it is major news and subject to investigation even if no one is seriously injured. FSD cars, if they ever happen, will need to be orders of magnitude safer than human-driven cars to be accepted by users and legislators.
 
>>but at least the VW systems work pretty well with no phantom braking.<<

Did you read the comments following it about cars phantom braking? It's not a perfect system either.

FWIW, the Tesla AEB test is just after the 2-minute mark (source: the official Euro NCAP Tesla Model 3 2019 safety rating video). It looks like the Model 3 could do with better child protection in the rear: the frontal offset crash test offered only marginal protection for a larger (10-year-old) child. Of course, the best protection is not to have the accident in the first place.

 
>>However, while I also believe that genuine FSD is way off, it is also true that it doesn't necessarily have to deal with every obvious situation properly. It only has to get itself to the state where the overall accident rate and severity of accidents is less than that of human driving, and can then try to improve on that further. In other words, if it could avoid 10 fatal accidents per period but created 9 new silly fatal accidents in that period, then it has still demonstrated overall superiority.<<

I think the flaw here is that to make the comparison you need a statistically valid sample - and to get that you would perhaps need 10,000 autonomous deaths! That would go down well...
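
For a sense of scale, here is a back-of-envelope sketch (my own numbers, not from the thread) of the sampling problem. Assuming a human fatality rate of roughly 1.1 per 100 million miles and treating fatalities as Poisson events, the exposure needed just to statistically distinguish a modest improvement runs into the tens of billions of miles - and that is the bare statistical minimum, before anyone asks about accident types, severities and conditions:

```python
# Back-of-envelope: exposure needed before a driverless fleet's fatality
# rate is statistically distinguishable from the human baseline.
# Assumes Poisson fatalities and a one-sided z-test at 95% confidence.
# The normal approximation is crude for small counts - illustration only.
Z = 1.645                 # one-sided 95% critical value
HUMAN_RATE = 1.1e-8       # ~1.1 fatalities per 100 million miles

for improvement in (0.10, 0.25, 0.50):        # fleet is 10% / 25% / 50% safer
    av_rate = HUMAN_RATE * (1 - improvement)
    diff = HUMAN_RATE - av_rate
    # Require (diff * miles) / sqrt(av_rate * miles) > Z, solved for miles:
    miles = Z ** 2 * av_rate / diff ** 2
    deaths = av_rate * miles                  # expected fleet fatalities seen
    print(f"{improvement:.0%} safer: ~{miles:.1e} miles, ~{deaths:.0f} fatalities")
```

A 10% improvement already needs on the order of 20 billion autonomous miles; demonstrating it across road types, weather and severities would need far more.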

Tesla already give figures for accidents per x miles of highway driving with autonomy compared to national rates (albeit one can argue about the selection for those figures).
The history of transport is littered with innovations leading to occupant deaths, followed by changes and improvements to minimise them. Human safety 'pilots' are there as much to reassure occupants as out of necessity - driverless trains and buses do exist. But we have also seen what happens when a large passenger aircraft crashes into an urban conurbation, and indeed some cases of non-terrorist suicidal pilots taking their passengers with them - which can't happen if there's no cockpit...
 

Dream on! Incident (not accident) reports are chock-full of accidents-waiting-to-happen that have not happened because there are pilots who think around the problem. Perhaps the classic case is the Qantas SIN-SYD flight that lost so many systems that the computers were unable even to calculate the landing speed until the crew had told them which indications to ignore.
As I said, accidents per million autonomous miles do not tell the whole story - and in fact I suspect the figures don't exist. Why? Because those millions of miles are driven with a safety driver. Let's see the disengagements per million miles - that would be quite an eye-opener, I think!
 

That logic would deny the possibility of driverless technology ever arriving. Accidents will always happen due to machine failures, but they can be mitigated. I'm not suggesting that FSD is here - indeed my personal opinion is that it's way, way away. But I still contend that it only has to show itself to be better than human driving rather than 'perfect'.
 
We humans use maps to plot a route, but frequently the actual road differs for a variety of reasons, from the simplest - a temporary blockage, roadworks etc - all the way through to an entirely new and different road. In every case a careful driver should be able to assess the situation and navigate through safely.

I've used FSD frequently since 2017 on my daughter's Model X in California, Nevada and Idaho, and it performs pretty well over there on mostly simple, relatively new roads (with occasional roundabouts, which I have never attempted with FSD/EAP). There are still random occurrences where strange things happen, requiring the driver to take immediate control.

FSD needs to get to the level of an advanced human driver for every road possibility (& every country it is going to be used in) in order to be entirely pilotless. Human ingenuity means it may well happen but almost certainly not within the lifetime of my current car and probably not during the remainder of my own life. I do hope Tesla achieve that someday though.

For that reason I see no need to spend on any more than the Autopilot that comes with my Model 3, much as I would like to. Even in the USA, freeway navigation and the ability to impress others with what is currently possible are FSD's two greatest assets.
 
I took a run down to High Wycombe a couple of weeks ago. 6 times I had the phantom braking issue, and I was trying to figure out what caused it. I know about the shadows under bridges - that was 1 time, and I can't really do anything about that. But the issue that got me thinking happened 4 times, when alongside a truck.
4 times I was overtaking in the outside lane, returning to the middle lane, when a curtainside truck was in the inside lane and alongside me (the other time was the same but with a pickup towing a trailer with a small excavator on it).
I'm wondering if it's the tails from the fastenings on the curtainside trailer that Autopilot is sensing, seeing them as something coming towards the car, even if only by an inch or so, and picking them up as a threat. This could have been the case for the pickup and trailer too, but it was dark and I didn't see the trailer clearly enough to tell whether anything was blowing around.
The solution I found was not to pull in next to a curtainside truck. I only had 1 instance of phantom braking northbound, when I forgot!!
Thoughts anyone?
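
Purely to illustrate the theory above (none of this reflects Tesla's actual logic, and all thresholds are invented): a naive lateral-threat check that brakes when something shows a closing velocity and a short projected time to intrusion would behave exactly this way if a flapping curtain strap briefly reads as a closing object:

```python
# Illustrative sketch of a naive lateral "threat" check. A strap snapping
# outward gives a momentary small gap plus closing velocity, which a simple
# time-to-intrusion threshold converts into a brake event.
LANE_MARGIN_M = 0.3    # distance from the car's side that counts as intrusion
TTI_BRAKE_S = 1.0      # brake if intrusion is projected sooner than this

def threat(lateral_gap_m: float, closing_mps: float) -> bool:
    """Flag a threat if the object would cross into our margin soon."""
    if closing_mps <= 0:                       # moving away or running parallel
        return False
    time_to_intrusion = (lateral_gap_m - LANE_MARGIN_M) / closing_mps
    return time_to_intrusion < TTI_BRAKE_S

print(threat(1.2, 0.0))     # steady truck one lane over: no threat
print(threat(0.35, 0.5))    # strap flicks out a few cm: "threat" -> phantom brake
```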
 
>>What was immediately apparent was how much more stable it was compared to my AP2.5 car - solidly in the middle of the lane passing trucks, no phantom braking.<<

Stable doesn't = better/more advanced.

If you look at the specs of what AP1 is operating on, you will see it's essentially driving blind, with no idea at all what's around it or of the potential for a crash, hence it just continues on regardless.

Ignorance, as they say, is bliss - which is exactly what AP1 is.
 
>>But the issue that got me thinking happened 4 times, when alongside a truck.
4 times I was overtaking in the outside lane, returning to the middle lane, when a curtainside truck was in the inside lane and alongside me
<snip>
The solution I found was not to pull in next to a curtainside truck. I only had 1 instance of phantom braking northbound, when I forgot!!
Thoughts anyone?<<

I think Autopilot cannot accurately tell which lane the truck is in and freaks out. Sometimes when I'm in the rightmost lane (lane 3) overtaking a truck in the left lane (lane 1), the visualisation shows the truck in the middle lane.

I have the same work-around as you: get well clear of the truck before pulling in.
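
As a hypothetical illustration of how a visualisation could put a truck in the wrong lane (this is not Tesla's method, just the simplest model that produces the symptom): if the displayed lane were a noisy lateral-offset estimate quantised by lane width, a wide vehicle measured near a lane boundary would flip between lanes:

```python
# Hypothetical: lane assignment as quantised lateral offset. A noisy offset
# estimate for a wide vehicle near a boundary lands in the wrong lane.
LANE_WIDTH_M = 3.65

def lane_index(lateral_offset_m: float) -> int:
    """0 = ego lane; positive = lanes to the left (towards UK lane 1)."""
    return round(lateral_offset_m / LANE_WIDTH_M)

print(lane_index(7.3))   # truck truly two lanes left: shown in lane 1 (correct)
print(lane_index(5.4))   # noisy estimate of the same truck: shown in the middle lane
```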
 
>>Stable doesn't = better/more advanced.

If you look at the specs of what AP1 is operating on, you will see it's essentially driving blind, with no idea at all what's around it or of the potential for a crash, hence it just continues on regardless.

Ignorance, as they say, is bliss - which is exactly what AP1 is.<<
Yes, true, but the BS going on with AP2.5 is wearing out my patience. I currently just use the speed control and steer myself. The family hates all the sudden deceleration when I try using EAP/FSD. I'd rather have AP1, as it sticks to its remit, even if that means it's not accounting for what's in the next lane by arbitrarily braking o_O
 
>>I took a run down to High Wycombe a couple of weeks ago. 6 times I had the phantom braking issue. <snip> 4 times I was overtaking in the outside lane, returning to the middle lane, when a curtainside truck was in the inside lane and alongside me. <snip> The solution I found was not to pull in next to a curtainside truck.<<

I've only had my M3 a couple of weeks but have had a few of these. My theory is that the lateral movement towards the lorry in the inside lane spooks the system, as the lorry appears to be approaching.
I also had it today around a sweeping right-hand bend with pedestrians on the pavement on my nearside; again, I think the car is not fully able to work out that I'm cornering, so the pedestrians appear to be in the straight-ahead path from that point in time.
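
A toy geometry sketch of that cornering theory (numbers invented, not any real planner): a predictor that projected the car's motion in a straight line would drift several metres towards the nearside pavement on a right-hand bend within a couple of seconds, putting pedestrians standing there inside the predicted corridor even though the real, curved path misses them:

```python
# Toy model: lateral gap between a straight-line projection and the true
# path on a right-hand bend. The straight projection sweeps the nearside.
import math

SPEED = 13.0     # m/s, about 30 mph
RADIUS = 60.0    # bend radius in metres

def curved_offset(t: float) -> float:
    """How far (m) the true curved path has pulled right of straight-ahead at time t."""
    angle = SPEED * t / RADIUS
    return RADIUS * (1 - math.cos(angle))

for t in (1.0, 2.0, 3.0):
    # Equivalently: the straight-line projection is this far left of the real
    # path, i.e. over the nearside pavement.
    print(f"t={t:.0f}s: straight projection off by {curved_offset(t):.1f} m")
```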
 
I had my first rather bizarre phantom braking issue today.
Driving at around 6 mph in a supermarket car park, a pedestrian appeared ahead between two cars and, rather than walking across the road in front of me, decided to walk slightly towards me in order to pass behind my car, as it seemed a more efficient use of time etc.

The car slammed on its brakes in an emergency stop and the emergency braking alerts chimed twice slightly afterwards.

Beyond the obvious concern of why this would occur at all, my main problem is how any following car - typically glued to your back bumper - is ever going to see and react in time in a scenario where there is clearly no hazard present.
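
Rough, generic numbers (not from this incident) show why that worry is well founded: a follower keeping a typical one-second gap covers more than the whole gap during perception-reaction time alone, before any braking starts:

```python
# Generic arithmetic: distance a tailgating follower travels during
# reaction time versus the headway gap they are keeping.
HEADWAY_S = 1.0       # "glued to the bumper" time gap
REACTION_S = 1.5      # typical driver perception-reaction time

for mph in (20, 30, 50):
    v = mph * 0.447                # speed in m/s
    gap = v * HEADWAY_S            # following distance in metres
    travelled = v * REACTION_S     # metres covered before braking even starts
    print(f"{mph} mph: gap {gap:.1f} m, {travelled:.1f} m gone before braking")
```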
 
>>That logic would deny the possibility of driverless technology ever arriving. Accidents will always happen due to machine failures, but they can be mitigated. I'm not suggesting that FSD is here - indeed my personal opinion is that it's way, way away. But I still contend that it only has to show itself to be better than human driving rather than 'perfect'.<<

It's an old trope, but still true, that any program with more than a few lines of code can contain bugs that are effectively hidden except when an unusual set of inputs or circumstances occurs. I certainly agree about the timescale.

I find it rather depressing that each update seems to alter the behaviour of my car in unusual if not random ways. The last few updates have altered some of my settings, made unlocking the car a double-double press on the fob instead of a single double press (discovering this answered why I suddenly found the boot button no longer worked), taken away the TeslaCam in-car formatting option, and one or two other things.

None of which is life-threatening, but it is disturbing in a supposedly near-FSD-capable vehicle. As I have written many times, autonomous cars have more complex operations to perform than aircraft autopilots, yet the industry seems to have this hubristic attitude that it's just around the corner. Aviation took 30 years to engineer, validate and test autoland, starting in the sixties - and that's with triplicated voting systems, licensed engineers' maintenance, heavy regulatory oversight and so on.

I'm not trying to lecture anyone, but the thought of millions of untrained, unregulated owners rushing around in 2-tonne vehicles which have no maintenance oversight or even full documentation of updates makes me uncomfortable!
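
For readers unfamiliar with the aviation reference: the "triplicated voting systems" mentioned above classically mean three independent channels feeding a 2-out-of-3 voter, so any single faulty channel is outvoted. A minimal sketch of the principle (illustrative only):

```python
# 2-out-of-3 voting: the median of three independent channels masks one
# arbitrary channel failure; gross disagreement raises a maintenance flag.
def vote(a: float, b: float, c: float, tol: float = 0.5) -> float:
    lo, mid, hi = sorted((a, b, c))
    if hi - lo > 2 * tol:
        print("channel disagreement: maintenance flag raised")
    return mid

print(vote(101.2, 101.4, 101.3))   # healthy channels: ~101.3
print(vote(101.2, 57.0, 101.3))    # one failed sensor: still ~101.2
```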
 
>>Aviation took 30 years to engineer, validate and test autoland, starting in the sixties - and that's with triplicated voting systems, licensed engineers' maintenance, heavy regulatory oversight and so on...<<

Autoland is a relatively simple process with all sorts of ground-based assistance... and autoland has crosswind limits below those of manual flight!

I'd be happy if the darned car slowed down for corners it can't see round, instead of ploughing on at the maximum speed limit and then dumping control to the driver when it gets it wrong (and yes, I know I'm not supposed to use it on those roads... it's called play-time, and seeing if Tesla have improved it yet).
 
I'm surprised these kinds of comments are not far more commonplace:

>>made unlocking the car a double-double press on the fob instead of a single double press (discovering this answered why I suddenly found the boot button no longer worked)<<

There do seem to be changes in operation that, on the face of it, are pointless and certainly confusing/annoying. I'm sure there is usually a reason for the changes, but I do wish that before accepting changes in behaviour of that type I were given the chance to decline them.

On one update a few back, I found the trunk handle operation changed. Now it's back to double-double clicking. (I prefer easy entry/proximity disabled.)

>>slowed down for corners it can't see round, instead of ploughing on at the maximum speed limit and then dumping control to the driver<<

From the outside, this would seem a fundamentally desirable characteristic, at least while the driver is responsible.

I suspect there may be times when slowing ahead of uncertainty is happening, but I haven't really driven enough to be sure. There is still far more evidence of the car overreacting, or spotting hazards too late.

I wonder if released changes are now focused on non-HW3 cars? That isn't evidenced by the update releases, but why keep confusing us with updates and changes to a platform that is allegedly being rewritten?
 
>>slowed down for corners it can't see round instead of ploughing on at max speed limit and then dumping control to the driver<<

That's just one example, I believe, of the absolutely fundamental problem with "FSD" (my quotes) in any present system.

The car "sees" the environment, but it doesn't "understand" anything. It reacts - very cleverly, yes, - but it doesn't have anything approaching consciousness and therefore cannot think outside the parameters it's given.

I have just treated my $8,500 worth of FSD as part game, part investigation: I never expect it to come to fruition except in certain conditions, and even then not without a driver.

Jeez - even professors of neuroscience can't define or explain consciousness and I'm sure my car doesn't have it!
 
>>part game, part investigation<<

I think that really is the truth at the moment. When you look into neural nets it is very cool how they build confidence, how that confidence can get stronger, and how that compares with human learning and understanding...
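
For anyone curious what "confidence" means mechanically, it is usually nothing deeper than a softmax over the network's raw class scores; as training widens the margin between scores, the reported confidence sharpens. A minimal sketch (not any particular FSD network):

```python
# Softmax: raw class scores become a probability-like "confidence".
import math

def softmax(scores):
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical scores for (car, truck, shadow) early vs late in training:
print([f"{p:.2f}" for p in softmax([1.0, 0.8, 0.9])])   # ~0.37/0.30/0.33: unsure
print([f"{p:.2f}" for p in softmax([4.0, 1.0, 0.5])])   # ~0.93 "car": confident
```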

But are we really anywhere near those heady original objectives set for FSD? I don't think so, but I'm not too disappointed. What a cool journey to be part of.