Welcome to Tesla Motors Club

Phantom braking still an issue

I drove on Autopilot on the A14 through the new A1/Cambridge section yesterday & the car started to brake heavily from 72 mph at one point, virtually an emergency stop. Luckily nothing was following & I took control almost instantly. The sky was grey & I wasn't driving under anything so no shadows, straight piece of road, one car a long way in front & nothing visible in the rear view mirror.

Google maps navigation showed the new road correctly so I can only assume that the car somehow picked up a former road (or a tiny bit from a construction contraflow) in its path at that point. On other roads I occasionally notice that the speed symbol on the screen is different to the actual road.

It raises the question - what does Tesla actually use to determine a road's speed limit?

This was part of a mammoth journey from Chester via M6/A14 to assist my Father (outdoors) near Bury-St-Edmunds, supercharging at Elveden for the return - around 11 hours driving including the supercharger stop.

At one point near the Gravelly Hill Interchange when using Autopilot/Autosteer in heavy traffic the moving graphic display locked up & Autopilot disengaged for a couple of minutes - looked like the car lost sight of the lane lines so my car and those around remained static on the screen for about 1/2 mile even after the traffic spread out and we moved on. It then corrected, I switched back to Autopilot & everything was fine. I hope that's a one-off.

It's made me much more wary about these sorts of things happening when using Autopilot.

(if anyone here driving a blue LR, followed by a blue M3P passed a white LR, M6 near Coventry yesterday morning, that was me!)

I've had the same issue on the new A14 section - I thought it might be related to Google Maps not being updated to show the new road - and also on the M40 on the way to London, on the southbound and northbound carriageways near Oxford services.
 
I don’t understand how phantom braking adds to safety? Surely phantom braking is braking for a hazard that doesn’t exist, so doesn’t that indicate that it’s not functioning as it should?

For all the talk of FSD, phantom braking is clear real-life evidence that the 'vision' these cars have is very limited. It might seem easy for us to tell the difference between a billboard 800 metres away above or near the road and a stationary truck, but for a computer it's incredibly hard, even with radar or LIDAR.

The killer Uber car a few years ago actually 'saw' the pedestrian crossing the road, but Uber had deactivated automatic braking at high speed, presumably because there were too many 'false positive' events. Phantom braking is actually by far the safest way for a system to proceed if it's unsure, but we humans find it annoying to have sudden deceleration events every time the system develops uncertainty.

Phantom braking is Tesla's way of giving the AP computer time to 'check' whether the object it thinks is ahead is a real-life obstacle or not.

Just because you don't see an obstacle doesn't mean AP doesn't see one. From what I understand, AP forms its vision by sampling very small pixel areas, so even a random bit of dirt on the camera could trigger a phantom braking event.

Ultimately, what's worse news for Tesla: more phantom braking events, or another death because the car failed to see a clear obstacle in the road?

FSD is still a long, long way off. Can anyone here seriously put the lives of their families in the hands of a system that still cannot 'see' an overturned truck blocking the road in broad daylight?
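To make the false-positive/false-negative trade-off described above concrete, here is a toy sketch of a confidence-threshold braking decision. Everything in it - the threshold values, the confidence scores, the scenarios - is invented for illustration and has nothing to do with Tesla's actual perception stack.

```python
# Toy model: brake whenever the perception system's confidence that an
# obstacle is real exceeds a threshold. Lowering the threshold trades
# missed obstacles for phantom braking events, and vice versa.

def should_brake(obstacle_confidence: float, threshold: float) -> bool:
    """Decide to brake when confidence in a real obstacle meets the threshold."""
    return obstacle_confidence >= threshold

# Hypothetical detections: (confidence score, whether the obstacle was real)
detections = [
    (0.95, True),   # clear stationary truck
    (0.40, False),  # billboard near the road
    (0.55, False),  # dirt on the camera lens
    (0.60, True),   # partially occluded car
]

for threshold in (0.3, 0.5, 0.7):
    phantom_brakes = sum(
        should_brake(c, threshold) and not real for c, real in detections
    )
    missed_obstacles = sum(
        not should_brake(c, threshold) and real for c, real in detections
    )
    print(f"threshold={threshold}: {phantom_brakes} phantom brake(s), "
          f"{missed_obstacles} missed obstacle(s)")
```

Note how no threshold in this toy set gives zero phantom brakes *and* zero missed obstacles - that is the bind the posts above describe.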
 
I've experienced two episodes of severe phantom braking, i.e. not just slowing down but slamming the brakes on for no apparent reason at motorway speed (no gantry, not overtaking a lorry). One occasion wasn't an issue, but the other time it was only the excellent reflexes of the guy behind that avoided a rear-ending (you could argue he was too close, but we all know that part of following someone involves looking ahead of them too).

However, while I also believe that genuine FSD is way off, it is also true that it doesn't necessarily have to deal with every obvious situation properly. It only has to get itself to the state where the overall accident rate and severity of accidents is less than for human driving, and can then try to improve on that further. In other words, if it could avoid 10 fatal accidents per period but created 9 new silly fatal accidents in that period, then it has still demonstrated overall superiority.
 
>>However, while I also believe that genuine FSD is way off, it is also true that it doesn't necessarily have to deal with every obvious situation properly. It only has to get itself to the state where the overall accident rate and severity of accidents is less than for human driving, and can then try to improve on that further. In other words, if it could avoid 10 fatal accidents per period but created 9 new silly fatal accidents in that period, then it has still demonstrated overall superiority.<<

I think the flaw here is that to be able to compare you have to have a statistically valid sample - to get that you would perhaps need 10,000 autonomous deaths! That would go down well...
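To put a very rough number on that, here is a back-of-envelope Poisson calculation of how many miles (and how many autonomous fatalities) it would take to demonstrate a 10% improvement over the human rate with ~95% confidence. The 1-fatality-per-100-million-miles human rate and the 10% improvement figure are illustrative assumptions, not real data.

```python
import math

# Assumed rates - illustrative only
human_rate = 1 / 100_000_000      # ~1 fatality per 100M miles (assumption)
improvement = 0.10                # claim: autonomy is 10% safer (assumption)
auto_rate = human_rate * (1 - improvement)

# Treat fatalities as Poisson counts over `miles` driven by each fleet.
# The gap between the expected counts must exceed ~1.96 standard
# deviations of the difference for a rough 95%-confidence comparison.
z = 1.96
miles = 1.0
while True:
    expected_human = human_rate * miles
    expected_auto = auto_rate * miles
    gap = expected_human - expected_auto
    stderr = math.sqrt(expected_human + expected_auto)
    if gap > z * stderr:
        break
    miles *= 1.1  # grow the fleet mileage until the gap is detectable

print(f"~{miles:.2e} miles per fleet")
print(f"~{auto_rate * miles:.0f} autonomous fatalities observed on the way")
```

Under these toy assumptions it comes out at tens of billions of miles per fleet, with several hundred autonomous fatalities accumulated before the comparison is statistically meaningful - the same order-of-magnitude point as the post above.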
 
To be fully autonomous, 'full self driving', any manufacturer would have to aim for 100% success, just as the aircraft industry does. Even the most advanced passenger aircraft require a pilot/co-pilot to be available to take over at any time, so I just don't see how any software-controlled vehicle can achieve total autonomy given the almost infinite number of real-world situations that exist, especially in countries like the UK.

There will be plenty of critics out there ready to jump on every fatality and this will probably resonate with large numbers of ordinary motorists. I am not a car lover by any stretch of the imagination but have always been fascinated by tech - my Model 3 is one of the best things I have ever owned and it constantly fascinates and surprises me. I doubt the majority of drivers out there see a Tesla as more than just a car and any publicised information about technical failings will make them wary about the brand.
 
Our 2016 VW Golf 1.4 SE has adaptive cruise control as standard. Radar based. It works perfectly and in four years of ownership we’ve never experienced phantom braking.

You were lucky. My brother-in-law's Golf slammed the brakes on for a plastic bag blowing in the wind. It also wanted to auto-park into a tree. Not at the same time, obvs.

It's a very hard problem to solve and all solutions are imperfect, just like humans.

Take the human out of the equation and you remove the root cause of 90% of all accidents. It's what you replace the human with that counts.
 
Our 2016 VW Golf 1.4 SE has adaptive cruise control as standard. Radar based. It works perfectly and in four years of ownership we’ve never experienced phantom braking.

Same with two Audi A6s. It’s also much smoother in the way it brakes and accelerates. In other words it feels like an experienced driver is in control rather than a 17 year old on their second driving lesson.
 
Our 2016 VW Golf 1.4 SE has adaptive cruise control as standard. Radar based. It works perfectly and in four years of ownership we’ve never experienced phantom braking.

That doesn't mean it's better, which is a common assumption people make when comparing AP1 hardware to the current AP hardware/software.

The adaptive cruise control system in your 2016 Golf will have zero chance of avoiding a crash with a stationary object when the closing speed is greater than 30 mph.

The less sophisticated a system, the less there is to go wrong, but Tesla is pushing for FSD, which takes things to a different level.
 
Same with two Audi A6s. It’s also much smoother in the way it brakes and accelerates. In other words it feels like an experienced driver is in control rather than a 17 year old on their second driving lesson.

It's an illusion of control when actually there is very little 'experience' and zero intelligence behind the current crop of driver assistance features.

The fact that Tesla has taken so long even to get the wipers to work via a true understanding of rain, rather than just relying on a sensor, shows how hard it is to replicate any form of intelligence.
 
What folks are noting is that the basics feel sub-standard compared to well established systems in the cars Tesla is trying to get people out of.

Interesting debate - is it better to pursue the next levels up but miss out on quality at the first levels, or to get quality right at the lower levels and then build upwards? Tough one.

I have had phantom braking on a 2016 Passat (nearly always low/long trailers) when overtaking and also a 2018 BMW 6 GT (nearly always lorries), it is terrifying and really unfair on people following, plus you look and feel like an idiot. On both cars it made me reduce the scenarios where I put the TACC equivalents on.

If my LR does it regularly I will just have to use the systems less as it’s impossible to relax/concentrate on wider road hazards if your brain is constantly trying to predict a localised braking event that hasn’t happened yet. Tesla do need to get it all right, not just the stuff nobody else has done yet.
 
Radar is great for measuring relative speed differences and if following the car ahead was the only requirement then radar would be the perfect solution, it is because Tesla are trying to do so much more that we have the 'phantom' problems.

Tesla's stance has long been that they don't need LiDAR and can use vision to mimic the way human drivers work. Radar isn't much help, as there are too many stationary objects around, so they have to declutter the radar returns by ignoring things that have not been observed moving. LiDAR would be more useful here in building a model of what is going on around the car, but without it we are dependent on machine vision, and that is taking a lot of time to train, even with the huge volumes of data that Tesla gets from the cars.

It is typical of Tesla to stick to a viewpoint regardless of the time it takes. Auto-wipers could have been near perfect from day 1 with a rain sensor, but they remain a work in progress as they teach the car to recognise what rain looks like through a camera...
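The 'declutter' step described above can be sketched very roughly: radar reports a radial closing speed for each return, and returns whose implied ground speed is near zero look identical to bridges, signs and roadside clutter, so they get discarded. This is a toy illustration of the idea, not Tesla's actual pipeline; all speeds, ranges and the tolerance are made up.

```python
EGO_SPEED = 31.0  # our speed in m/s (~70 mph) - illustrative

# Hypothetical radar returns: (range in metres, measured closing speed in m/s)
radar_returns = [
    (120.0, 31.0),  # closing at exactly our speed: stationary (gantry? truck?)
    (60.0, 8.0),    # car ahead, travelling slower than us
    (200.0, 31.2),  # overhead sign, effectively stationary
    (45.0, -2.0),   # car pulling away from us
]

def is_moving(closing_speed: float, ego_speed: float, tol: float = 0.5) -> bool:
    """A return counts as 'moving' if its ground speed (ego - closing) is nonzero."""
    ground_speed = ego_speed - closing_speed
    return abs(ground_speed) > tol

# Keep only returns that have been observed moving; stationary returns
# (including a genuinely stopped truck) are dropped along with the clutter.
tracked = [r for r in radar_returns if is_moving(r[1], EGO_SPEED)]
print(tracked)
```

The uncomfortable consequence is visible in the toy data: the stationary truck and the overhead sign are filtered out together, which is exactly why a stopped obstacle in lane is so hard for a radar-decluttered system.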
 
What folks are noting is that the basics feel sub-standard compared to well established systems in the cars Tesla is trying to get people out of.

That’s the frustrating thing. Teslas are touted as being among the most innovative and technically advanced cars available, but when you get one you find that things you have taken for granted for years just don’t work properly. Tesla’s innovation is commendable, but do they need to reinvent the wheel as well?
 
Tesla’s innovation is commendable, but do they need to reinvent the wheel as well?

The theory is that if a car is to be truly autonomous and full self-driving, then the car has to understand the world around it in real time like a human would. So regardless of what you use for wiper activation, the car needs to understand rain and how heavy it is. Equally, for adaptive high beam, the car needs to know there is a car coming towards it, regardless of what a light sensor might say.

Tesla has been hinting that the current limitation of AP is the AP 2.0/2.5 hardware rather than the code. So far the AP 3.0 hardware doesn't appear to be doing anything extra versus 2.0/2.5; the much-touted code 'rewrite' is supposed to bring a step change in features.

So we'll all have to wait and see.
 
The adaptive cruise control system in your 2016 Golf will have zero chance of avoiding a crash with a stationary object when the closing speed is greater than 30 mph.

For that, VW use Front Assist / Emergency Autonomous Braking which operates whether or not Adaptive Cruise Control is engaged. Here’s a test:
https://www.google.co.uk/url?sa=t&r...=DvbpS2a4aLg&usg=AOvVaw2SE7EDAigNcXQfBTJcddm_

Maybe not as good as the Tesla system - I don’t know - but at least the VW systems work pretty well with no phantom braking.
 