
Phantom braking still an issue

Worth watching the Tesla Autonomy Day video to see how some of this works.

Skip to about 2 hrs 18 mins for some of the discussion of deriving depth data from vision.

I agree that material should explain something, but IMO when you cut to the chase, all he says is 'it just happens in the NN'. I suspect that NNs have a closer link with monkeys and typewriters than FSD owners, or Tesla, would want to own up to!

A while ago I found a fun demo / simulation of an NN being trained to do basic number recognition. As the NN was being trained, you could watch it gaining confidence in its recognition. When trained, it was 100% correct for all the numbers, but the model also showed you the percentage probability that a given written number could be each possible digit 0-9.

You could then adjust the given 'number', adding dots or smudges or deleting parts, and see how the NN's predictions changed. So a small smudge near the centre of a 0 caused an increase in the probability that the 0 was supposed to be, or could be, an 8. There were many predictable outcomes for other extra marks near the subject character.

What surprised me was that removing a seemingly insignificant part of the character would reduce confidence for all possibilities, i.e. the presence of the removed line contributed to the confidence for every digit.
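For anyone who wants to poke at this themselves, here is a minimal sketch of that kind of demo. The original was a browser toy, so the specifics here (scikit-learn's built-in 8x8 digits dataset and a small MLP) are substitutions of mine, not the demo I used:

```python
# Minimal sketch of the digit-confidence demo described above.
# Dataset and model choice are my own stand-ins for the browser toy.
from sklearn.datasets import load_digits
from sklearn.neural_network import MLPClassifier

digits = load_digits()                       # 1797 8x8 images, pixels 0-16
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
clf.fit(digits.data, digits.target)

x = digits.data[0].copy()                    # a handwritten '0'
print(clf.predict_proba([x])[0].round(3))    # probability for each digit 0-9

x[3 * 8 + 4] = 16.0                          # 'smudge' a pixel near the centre
print(clf.predict_proba([x])[0].round(3))    # see how the probabilities shift
```

Setting pixels to 0 instead lets you probe the other effect described above: removing part of the character and watching whether confidence drops across the board.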

Of course this model was super basic, but the problem I see with NNs is that it must be near impossible to know which attributes they have come to rely on to differentiate between objects etc.

Nature is really good at being random in truly unfathomable ways!

Then you have sensors with unavoidable biases and perspectives that could also build unexpected dependencies / associations into the NN.

From playing with simple autonomous robots, I recognise the dilemma: more sensors give more data, but also a greater chance of erroneous data. One infallible, universal sensor would be so much better than several less consistent ones having to argue about who's right! Adding overriding safety loops that do something in the case of a suspected system error or unhandled condition seems like a good safety feature, until you realise that their predictable behaviour often comes from simplicity, and a simplistic view is often not the best one!
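To make the dilemma concrete, here is a toy sketch of my own (nothing to do with Tesla's actual fusion or safety logic): three redundant range sensors, where a median vote shrugs off a single bad reading but a simplistic 'brake if anything looks close' override fires on it.

```python
# Toy sketch of the sensor-voting dilemma -- my own illustration,
# not Tesla's fusion or safety logic.

def fuse_median(readings):
    """Outvote a single outlier by taking the median reading."""
    return sorted(readings)[len(readings) // 2]

def naive_safety_loop(readings, brake_below_m=10.0):
    """Simplistic override: brake if ANY sensor reports a close obstacle."""
    return any(r < brake_below_m for r in readings)

# Radar and camera agree the road is clear; the ultrasonic glitches low.
radar, camera, ultrasonic = 80.0, 78.5, 2.1            # metres to obstacle
print(fuse_median([radar, camera, ultrasonic]))        # 78.5 -- outlier outvoted
print(naive_safety_loop([radar, camera, ultrasonic]))  # True -- phantom brake!
```

The median vote only works because there are three sensors; with two disagreeing sensors there is no tie-breaker, which is exactly the radar-vs-vision argument.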

From a geek viewpoint it is amazing what Tesla has already done, and I would say nothing against that. I did not select my MS because it has FSD, but it met all my other parameters including price, so I ended up with FSD and enjoy having a small participation in the development cycle.

Regardless of the reasons behind the changes in the use of radar and how that might relate to phantom braking, if Tesla Vision can get it right all the time, that would be ideal. It does feel that we are still at the very earliest stages of a potentially very long journey.
 
Had my car a little over two weeks now and have had this braking occur twice on separate occasions. The worst one was yesterday heading north on the M40 (apologies to the BMW driver, who must have been as scared as I was!). I overtook the BMW whilst on Autopilot, then indicated to pull back in; as I moved across the lane lines the car effectively slammed on the brakes. How I didn't end up sideways and get hit by the BMW I've no idea, but that could have been a really serious incident. As far as I can tell the car is fully up to date and otherwise working fine. Very wary of Autopilot now.
Typically this happens if you are moving back from lane 3 and heading towards a truck in lane 1. The car can’t know that you are going to straighten out in lane 2 before you hit the truck, so it applies the brakes. You learn to only move in gently in these situations, avoiding heading for trucks. Until the car is ‘FSD’ and changing lanes under its own control, I don’t know how this problem can be overcome. It gave me a real shock the first time it happened to me, because I was moving in sharpish as another car was coming up fast behind in lane 3. That could also have hit me as my brakes came on.
 
I think it's worth noting that the behaviour of the current (8.x) FSD City Streets beta is vastly superior to what we have in our vehicles. It may not be perfect, and may never be, but comparing the behaviour of a vehicle with the City Streets module enabled to that of a vehicle without it (or with it not enabled), or even worse, one being used in an operational domain it is not designed for, is really not a fair comparison.

The City Streets module adds so much functionality, including important functionality such as path planning, that a non-City-Streets FSD vehicle is effectively a totally different vehicle operating in a completely different domain from any non-beta vehicle. What will be interesting is how much of the City Streets functionality, or its enabling functionality, makes its way back into EAP/FSD Highways or basic Autopilot. Only then can some comparisons be made, even if in the meantime Tesla seem to be involved in a bit of back-stepping poorly shrouded in marketing BS.

Interesting point. I suspect that basic AP probably acts as a feeder product for EAP/FSD purchases, so they might want to bring the improvements to it - would you buy FSD or EAP if the basic system didn't work? I don't think I would.
 
Until the car is ‘FSD’ and changing lanes under its own control
Well, with current FSD / AP, lane changes are under the car's control, so it should know whether it plans on hitting said lorry or not.

And while it's hanging around making its mind up, the lane change timeout is ticking away, getting ready to abort the move halfway between lanes and swerve back to where you were.
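Pure speculation on my part, but the behaviour feels like something along these lines; the function names and timeout value below are hypothetical, not Tesla's code:

```python
# Pure speculation -- a guess at the shape of the logic, not Tesla's code.
# The timeout value and function names are hypothetical.
import time

LANE_CHANGE_TIMEOUT_S = 5.0                  # hypothetical deadline

def attempt_lane_change(change_complete, steer_back_to_origin):
    """Finish the lane change before the deadline, or abort back."""
    deadline = time.monotonic() + LANE_CHANGE_TIMEOUT_S
    while time.monotonic() < deadline:
        if change_complete():                # fully across the lane line?
            return "done"
        time.sleep(0.05)                     # keep dithering...
    steer_back_to_origin()                   # ...then swerve back mid-move
    return "aborted"

# e.g. attempt_lane_change(lambda: False, lambda: None) -> "aborted" after 5 s
```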
 
I think it's worth noting that the behaviour of the current (8.x) FSD City Streets beta is vastly superior to what we have in our vehicles.

What do the latest beta versions run for freeway driving (in the States)? Originally I thought the car switched back to something either identical or at least very similar to mainstream AP / FSD when not on city streets, and would therefore experience the same sort of issues as those being discussed here.
 
Precisely one of the reasons why understanding how, when and where to use auto lane change makes such a big difference to the success of using EAP/FSD.
Agreed. But by the time I've determined that now is a good moment to switch lanes given the eccentricities of AP, I might as well have just made the move 'unaided'.

Similar feeling to the kids offering to help me wash the car!
 
Of course this model was super basic, but the problem I see with NNs is that it must be near impossible to know which attributes they have come to rely on to differentiate between objects etc.

The fundamental difference between an NN interpretation of an image and human vision is that human vision was learned in conjunction with all our other senses to give an 'understanding' of our world, which actually allows us to avoid processing much of it. If we drive down a country lane, vision makes us aware of hedges and trees, but we don't actually 'see' them (in any detail) unless we look. Whereas an NN interpretation of an image must process every pixel from every camera.

What we ascribe to intuition (and that is still a major part of driving) is the ability for bits of our brain to recognise anomalies and rapidly integrate them based (sometimes) on unrelated knowledge. Simple things like a branch swaying more than it should, or more than other branches, or not swaying typically, might alert a driver to that branch being about to break. NN gurus would call that a corner case and never add it to the system until they had many instances of such rare phenomena.

Another example: if you saw a kiddie playing with a ball on the pavement, you'd instantly add them to 'high attention' and be ready for the ball to vanish in the direction of traffic, reacting before the ball comes out between cars. And many of us don't just drive on vision - we use smell, hearing and tactile feedback through the pedals and steering...
 
Had a laugh at a busy roundabout today which I could not imagine an AI system being able to navigate automatically any time soon. The minute movements of the cars, and the way drivers looked into each other's cars, determined whether or not cars were able to pull onto the roundabout.
I was thinking there is no chance for an AI system, yet.
 
I drove from Manchester to Portsmouth a couple of weeks ago. Most of the journey had autopilot and TACC engaged.

While the general drive was pleasant and enjoyable enough, half a dozen incidents in each direction were nothing short of bloody dangerous.

For seemingly no reason the car just slams on the brakes. You could be in the fast lane doing 70mph when this happens totally out of the blue. Sometimes it happens when over-taking. Sometimes when changing lanes and sometimes with absolutely nothing around at all including bridges or shadows.

It is wholly inexcusable for this to be happening so often. I’m in the 5th year of my 2nd Tesla and this has never ever gotten any better. If anything it’s even more violent than before.

Maybe radar and vision are conflicting, maybe not.

I wouldn’t bet more than a few pounds that this will ever get fixed.
Yes, my experience exactly. I will not take your bet as I agree with you. And then there’s the foreign exchange rate to contend with.
 
Had a laugh at a busy roundabout today which I could not imagine an AI system being able to navigate automatically any time soon.
Agree with that. Only likely to be solved when automated systems in all cars negotiate with each other for priority.
 
That wouldn’t explain why you can be a lone warrior on the M6 and out of nowhere your car screeches to a halt from 80 to 0 in the fast lane.
Has this really happened to you? I can only envisage the car going to a total stop from 80 if steering wheel nags in AP have been repeatedly ignored. I believe that in this case the car stops, goes out of Drive and turns the hazard lights on. I’ve had phantom braking but nothing like what you describe which would frighten me silly.
 
Has this really happened to you?
Not to zero, because I’ve always intervened. I’ve let it do its thing when the road is quiet, and it got from 80 to about 30 before I stopped it.
 
Yep. I'll see if I can find the references if you are interested, but the more you watch videos, even those of the latest FSD City Streets beta or the latest Tesla Vision version, the more you see trucks swerving across lanes as they come alongside. I had hoped that this issue would be solved with the latest vision stack, but I am now not holding my breath.

One thing you do learn is not to cross back into a left-hand lane if you are approaching a truck to the left of that lane. That truck will often appear to swerve into the lane you are merging into, and the car will panic.

Update, with a video of this occurring even on the latest and greatest builds. Pick it up at around the 7-minute mark and look at the visualisations of the trucks being passed. Not quite the scenario above, but I think it demonstrates the issue of objects moving when being passed.
I watched this with interest and was concerned. It struck me that whilst this car has no radar, the whole vision-only software probably hasn’t been released to it yet.