
Phantom braking still an issue

This is one of many videos I’ve supplied to Tesla and it’s made zero difference to the problem being acknowledged or fixed.

This is not something your Tesla service centre can fix, it goes to the core problems faced by Autopilot / machine vision.

I do agree Tesla service could acknowledge the problem, but I guess they are under instruction not to.

Maybe ask @karpathy on Twitter? ;) He’s the Director of Tesla AI.

Or share with @greentheonly, who likes to point out Autopilot failings.
 
When my car is stationary even objects that don’t move, such as traffic lights, jump all over the screen, so it’s not surprising that other vehicles seem to act strangely when the car is moving at speed.

I suspect that anyone expecting the latest software to solve all these problems is going to be very disappointed, at least for a few years.
The ‘moving objects’ on the screen are probably caused by the tiniest changes in colour - a cloud moving, or the scanning of the camera. Different input to the algorithm gives a different output. The car could suddenly think the object is 10 cm nearer one second, render it differently on the screen, and then snap back again. Small distance-calculation differences shouldn’t affect driving behaviour, but you can see how big differences in perception would (& do).
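To illustrate the point, here is a toy sketch in Python (my own, not Tesla's actual pipeline): a vision model is just a function from pixels to numbers, so two almost indistinguishable frames can land either side of an internal decision boundary and produce visibly different estimates. The brightness values and the rounding below are invented purely to show the effect.

```python
import numpy as np

def toy_depth_estimate(patch: np.ndarray) -> float:
    """Stand-in 'network': maps mean brightness to a distance, snapped to 0.5 m steps.
    The snapping represents internal discretisation / decision boundaries."""
    return round(100.0 * float(patch.mean())) / 2.0

# The patch is deliberately chosen to sit near a decision boundary.
patch = np.full((64, 64), 0.4949)   # one frame's crop of an object
noisy = patch + 0.0002              # an imperceptible change (cloud, sensor noise, compression)

print(toy_depth_estimate(patch))    # 24.5
print(toy_depth_estimate(noisy))    # 25.0 - a 0.5 m 'jump' from a change no human would notice
```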
 
This is not something your Tesla service centre can fix, it goes to the core problems faced by Autopilot / machine vision.

I do agree Tesla service could acknowledge the problem, but I guess they are under instruction not to.

Maybe ask @karpathy on Twitter? ;) He’s the Director of Tesla AI.

Or share with @greentheonly, who likes to point out Autopilot failings.
It wasn’t meant for my local service centre to fix. They were in direct contact with Tesla USA. Still, after 3-4 months of dialogue and videos, nothing improved.
 

The ‘moving objects’ on the screen are probably caused by the tiniest changes in colour - a cloud moving, or the scanning of the camera. Different input to the algorithm gives a different output. The car could suddenly think the object is 10 cm nearer one second, render it differently on the screen, and then snap back again. Small distance-calculation differences shouldn’t affect driving behaviour, but you can see how big differences in perception would (& do).

In which case their AI is dumb as bricks. If it recognises e.g. a truck and plots its trajectory/speed etc., and is updating from its sensors at least every 1/60 of a second if not faster, it should know that the truck can’t jump entirely into the next lane in 1-2 frames. It should be fast to respond but also good at rejecting false readings.
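For what it's worth, the kind of sanity check being described is straightforward to express. A minimal sketch (my own, with assumed numbers for the update rate and plausible lateral speed - nothing here reflects Tesla's actual code):

```python
DT = 1.0 / 60.0            # seconds between perception updates (assumed)
MAX_LATERAL_SPEED = 5.0    # m/s - a generous bound on real lateral motion (assumed)

def gate_lateral(prev_estimate: float, measurement: float) -> float:
    """Reject physically impossible frame-to-frame jumps in lateral position."""
    max_step = MAX_LATERAL_SPEED * DT              # ~0.08 m of sideways motion per frame
    if abs(measurement - prev_estimate) > max_step:
        return prev_estimate                       # hold the last plausible value instead
    return measurement

estimate = 3.6   # truck being tracked in the adjacent lane, metres to our side
for z in [3.60, 3.58, 0.20, 3.57, 3.55]:           # one frame 'sees' it right in our lane
    estimate = gate_lateral(estimate, z)
    print(f"measured {z:4.2f} m -> used {estimate:4.2f} m")
```

The single impossible reading is ignored instead of being passed straight through to the planner, which is essentially the "good at rejecting false readings" behaviour being asked for.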
 
caused by the tiniest changes in colour.
What makes you suggest that? I imagine that large objects of uniform colour close to the cameras might be mis-read, but if small changes in COLOUR affected object positioning that would be a fundamental flaw, especially when you start using object movement to predict future movement.

I suggest 'jumping' is far more linked to the way that camera images are processed / stitched and that the jumping around is when different input sources contradict. Since camera images are flat / 2D, any position / 3D data must rely on assumptions about object sizes and / or images from multiple directions. I have seen low-level repeaters on traffic lights incorrectly interpreted as additional full-sized lights in the distance.
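The size assumption really is the crux of single-camera depth. A rough pinhole-camera sketch (illustrative numbers, not Tesla's camera parameters) shows how the same image patch gives completely different distances depending on what real-world size you assume - exactly the repeater-vs-full-sized-light confusion described above:

```python
FOCAL_LENGTH_PX = 1000.0   # assumed focal length, in pixels

def distance_from_size(assumed_real_height_m: float, image_height_px: float) -> float:
    """Pinhole model: distance = focal_length * real_size / size_in_image."""
    return FOCAL_LENGTH_PX * assumed_real_height_m / image_height_px

h_px = 25.0  # apparent height of a traffic-light head in the image, in pixels
print(distance_from_size(0.3, h_px))   # assumed to be a small repeater head:  12.0 m away
print(distance_from_size(1.0, h_px))   # assumed to be a full-sized head:      40.0 m away
# Same pixels, two size assumptions, two completely different positions on the road.
```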

Object positioning is the area where I have least confidence in Tesla's systems. Their output can glitch between completely different interpretations of what they see, yet that is not treated as a warning that the system is producing erroneous / impossible data. By the time data has been processed well enough to dictate the car's motion / behaviour in any way, there should be very high certainty in the validity of those actions. If the car 'knows' it is in free-flowing traffic in the middle of a multi-lane freeway / motorway, any suggestion that traffic lights keep appearing and disappearing in the middle of the roadway should be a point of concern!

Cases where sudden interventions occur based on a last-minute change of interpretation (e.g. an ultrasonic sensor at parking speed detecting a very close object) should be very rare, and any occurrence should be treated as a significant cause for concern - which is not the case given how things work at the moment.

The much-discussed 'complete re-write' seems to have dissolved into nothing, especially when you look at FSD Beta videos and realise just how many problems still persist between releases.
 
No phantom braking there.
That was in response to a previous post about the vehicle in lane 1 jumping into lane 2 as you pass it.

The problem with phantom braking is it follows no logic. Sometimes it will brake in instances such as this and sometimes it won’t. Sometimes it happens on a stretch of road and sometimes it doesn’t. Sometimes it happens when you pass a lorry, a bridge, a shadow and sometimes it doesn’t!!

For me, most of my phantom braking occurs when passing a high-sided lorry, or when you change from lane 3 to lane 2 with a lorry in lane 1 (as it jumps into your lane).
 
The problem with phantom braking is it follows no logic.

At least not logic that's easy to explain. Most likely it is triggered by edge cases, and by disagreements between different input sources (radar vs vision, and the different camera perspectives).

most of my phantom braking occurs when passing a high-sided lorry or when you change from lane 3 to lane 2 with a lorry in lane
Those seem to be the most common of the more consistent misbehaviours - certainly in my experience.
 
What makes you suggest that?
That’s just the way AI image processing works. You feed in a picture and the black box (trained on similar images) spits out an answer - in this case to recognise things like the type of vehicle (e.g. lorry). A very similar but different image might give a different answer, for instance 10 cm vs 15 cm. This in turn might mean the car thinks it’s closer than it is. Repeat lots of times and the result might be what we’re seeing.

The thing is, Tesla uses layers of AI. Once you know it’s a lorry and have its relative position, speed and trajectory, you then have to make some decisions (brake?!). You could have hard-coded rules, or more AI to make that decision too. Layers of opportunity for tiny, imperceptible changes to result in different behaviour for similar (but different) situations.
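As a toy example of that last point (my own sketch, not anything Tesla has published): a hard-coded rule sitting on top of a slightly noisy distance estimate flips its decision every time the estimate wobbles across the threshold, whereas even simple hysteresis makes the behaviour predictable. The threshold values below are invented.

```python
BRAKE_AT = 20.0    # metres - illustrative threshold only
RELEASE_AT = 22.0  # must open back up to this before releasing (hysteresis band)

def naive_decision(distance_m: float) -> bool:
    """Hard-coded rule: brake whenever the estimate dips below the threshold."""
    return distance_m < BRAKE_AT

def hysteresis_decision(distance_m: float, currently_braking: bool) -> bool:
    """Same rule, but latched: once braking, keep braking until the gap clearly opens."""
    return distance_m < (RELEASE_AT if currently_braking else BRAKE_AT)

braking = False
for d in [20.1, 19.95, 20.05, 19.9, 20.1]:     # distance estimate wobbling around 20 m
    braking = hysteresis_decision(d, braking)
    print(f"{d:5.2f} m  naive={naive_decision(d)}  latched={braking}")
# The naive rule toggles on and off every frame; the latched version decides once
# and stays predictable until the gap genuinely opens past 22 m.
```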

It does just show how far Tesla still has to go.

A dumb adaptive cruise control = yes please.
 
Yes, the inner workings of NNs are mystical. However, when training a NN you would think that colour would soon drop very low down the list of significance in deciding ‘not a lorry’. A NN that places high significance on colour (alone) would not do a good job of deciding what images are of.

I suspect that with very close objects, vision can lose the ability to determine shape and hence correctly identify objects. There was a post of a car glitching between seeing a truck and a wall right alongside, when in fact it was a curtain-sider parked nearby.

Totally agree that while all this is 'beta' there should be a dumb mode where at least basic things work predictably. One reason SWMBO does not drive the MS is because she doesn't want the uncertainty of so much unpredictable intervention by the car.
 
  • When a passenger is present, I don't use Autopilot on a road I haven't driven previously.
  • I never use Autopilot when my wife is present (has scared her witless on a couple of occasions)
  • I do use Autopilot on parts of longer motorway journeys and some dual carriageways when alone & it's mostly very good.
  • On any section where the car had previously applied sudden braking, I no longer activate it (eg A14 near Cambridge)
  • When I do use Autopilot I keep my foot just above the accelerator at all times, which isn't particularly comfortable or relaxing.
  • When about to pass curtain-sided lorries on busy stretches I often disable it.
  • Using Autopilot keeps me at a heightened state of alertness (which isn't necessarily a bad thing).

    (Driving manually is more enjoyable though).
I think the issues on the A14 around Cambridge are down to the navigation being out of date. The speed limit on the screen is all over the place through there, presumably as you cross the old A-roads.
 
Have you got the 2021 map update? i.e. do you know if this is still a problem with the latest maps?
I'm actually driving that section on the 13th & again the 16th (on 2021 maps) but both times my wife will be with me.

Maybe I can take one for the team on the way down & try AP if the road is clear behind us. We will have been driving for at least 3 hours by then so will probably be in the dog house already if even a single note of 'my' music has emerged from the playlist.
 
I was testing out TACC along the 30 mph stretch on the way to pick up the tiddler from nursery. Cruising along quite nicely until we went past a delivery van that was parked slightly further out from the other parked cars. The car decided to panic and slam on the anchors when we were alongside. Bit late by then, I think. It does show the serious limitations of using a vision system with no memory, analysing each frame in isolation. The front camera clearly saw the van from a while back, but I’m guessing as soon as it came into view of the side repeaters it thought it was a new obstacle and panicked. Fingers crossed the big new update, which should have constantly updated probabilities, is a big improvement.
I had a similar thing yesterday, the car suddenly freaked out when approaching a cyclist on a cycle path to the side of the road behind some bollards. Some part of the system suddenly decided the cyclist had ridden in front of the car (that's what the on-screen graphic showed), but in reality they just continued riding along the path.

Fortunately there weren't any vehicles behind me so it didn't really matter, but it didn't half surprise me as the car had been behaving perfectly up until then.

As others have noted it really does seem like Tesla's FSD solution is - in the same vein as commercial fusion power - destined to be permanently six months from working. I'm sure there are a lot of very clever people working on it, but right now they haven't even managed to get automatic wipers, headlights and basic TACC working flawlessly, so it doesn't look promising.

I would like to point out that they're very much not alone in these problems. I drove my wife's ID.4 last weekend and it slammed the brakes on when approaching a roundabout because of two cars waiting in the lane next to the one I was using as I went to filter; the difference is that VW haven't built their hype around how brilliant their self-driving tech is, they've just built a decent car very well (impressions so far).
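On the 'no memory, analysing each frame in isolation' point quoted above, the sort of fix being hoped for can be sketched in a few lines: a running belief that an obstacle really is in the path, updated every frame, rather than a reaction to any single frame. This is purely illustrative (the weighting and threshold are assumptions, not anything from Tesla):

```python
ALPHA = 0.3             # weight given to each new frame (assumed)
BRAKE_THRESHOLD = 0.8   # belief needed before acting (assumed)

def update_belief(belief: float, frame_says_in_path: bool) -> float:
    """Exponential moving average over per-frame detections - a crude form of memory."""
    return (1.0 - ALPHA) * belief + ALPHA * (1.0 if frame_says_in_path else 0.0)

belief = 0.0
frames = [False, False, True, False, False, False]   # one spurious 'cyclist in our path' frame
for i, hit in enumerate(frames):
    belief = update_belief(belief, hit)
    print(f"frame {i}: detection={hit}  belief={belief:.2f}  act={belief > BRAKE_THRESHOLD}")
# A single contradictory frame nudges the belief to 0.3 and it decays away again;
# a genuine obstacle reported frame after frame would cross the threshold within ~5 frames.
```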
 
As others have noted it really does seem like Tesla's FSD solution is - in the same vein as commercial fusion power - destined to be permanently six months from working. I'm sure there are a lot of very clever people working on it, but right now they haven't even managed to get automatic wipers, headlights and basic TACC working flawlessly, so it doesn't look promising.

I think it's worth noting that the behaviour of the current (8.x) FSD City Streets beta is vastly superior to what we have in our vehicles. It may not be perfect, and may never be, but comparing the behaviour of a vehicle with the City Streets module enabled against that of a vehicle without it (or with it disabled) - or, even worse, one being used in an operational domain it is not designed for - is really not a fair comparison.

The City Streets module adds so much functionality - important functionality such as path planning - that a non-City-Streets FSD vehicle is effectively a totally different vehicle, confined to the completely different operational domain that is available to any non-beta vehicle. What will be interesting is how much of the City Streets functionality, or its enabling functionality, makes its way back into EAP/FSD Highways or basic Autopilot. Only then can some comparisons be made, even if in the meantime Tesla seem to be involved in a bit of back-stepping, poorly shrouded in marketing BS.
 
Yes, the inner workings of NNs are mystical. However, when training a NN you would think that colour would soon drop very low down the list of significance in deciding ‘not a lorry’. A NN that places high significance on colour (alone) would not do a good job of deciding what images are of.

I suspect that with very close objects, vision can lose the ability to determine shape and hence correctly identify objects. There was a post of a car glitching between seeing a truck and a wall right alongside, when in fact it was a curtain-sider parked nearby.

Totally agree that while all this is 'beta' there should be a dumb mode where at least basic things work predictably. One reason SWMBO does not drive the MS is because she doesn't want the uncertainty of so much unpredictable intervention by the car.
The issue of colour is probably even more complicated! Even the human eye can distinguish some 13 million hues: coding to call something “blue green” and using that to make decisions in an autonomous car seems a bit iffy.
I would also like to know how they determine distance without radar or stereo vision. I suppose that by using a large chunk of the processing power to analyse the changes between frames it’s possible in theory to derive distance, but beyond a few metres.....
I too can only try a/pilot when SWMBO isn’t with me: her squeals when the car swerves between lanes, phantom brakes etc and the subsequent rants are just too much.
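On the distance question above: one way a monocular system can recover range is from the parallax between two frames as the car moves, treating the distance driven between frames as a stereo baseline (a big simplification, especially for objects straight ahead). A rough sketch with made-up camera numbers shows why such estimates degrade quickly beyond a few tens of metres, which is essentially the concern raised:

```python
FOCAL_LENGTH_PX = 1000.0              # assumed focal length, in pixels
SPEED_MS = 20.0                       # ~45 mph
FRAME_GAP_S = 1.0 / 36.0              # time between the two frames compared (assumed)
BASELINE_M = SPEED_MS * FRAME_GAP_S   # ~0.56 m driven between the frames

def range_from_parallax(disparity_px: float) -> float:
    """Stereo-style range: distance = focal_length * baseline / disparity."""
    return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

for true_range in (10.0, 50.0, 200.0):
    disparity = FOCAL_LENGTH_PX * BASELINE_M / true_range   # ideal, noise-free disparity
    est = range_from_parallax(disparity - 1.0)              # the same with 1 px of error
    print(f"true {true_range:5.0f} m  disparity {disparity:5.1f} px  "
          f"estimate with 1 px error {est:6.1f} m")
# At 10 m the disparity is ~56 px, so 1 px of error barely matters; at 200 m it is
# under 3 px, and the same 1 px error pushes the estimate out beyond 300 m.
```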