First experience with TACC failure that almost caused an accident

I have a humble suggestion: let's put our software version in our Sig
so we don't have to keep wondering which version a given person's car is on.

FWIW I've felt a bit nervous with the more aggressive AP in this update,
especially when I double tap but then realize I'm only in TACC, and the
car is hauling ass towards stopped traffic - will it slow down? Will it stop?
-- time to hit the brakes. Blood pressure spiking.
 
Tesla’s inability to detect stationary objects is very concerning. The truck here visually blends into the horizon and sky. The radar just does not detect it. Broken-down cars at the side of the road never appear to be detected, and Autopilot gets way too close to them. It seems like they just can’t detect a stationary object without a lot of phantom braking issues.
 
Everyone needs to keep in mind that the radar in a Tesla is designed to see only moving objects. That makes the car very reliant on the cameras for avoiding non-moving objects. I'm just guessing, but my thought is that radar saw nothing moving and the camera just saw empty space ahead because the truck and background are all close to the same shade of grey/white. I bet if the truck had even been going 1 mph this would not have happened. It's an interesting problem from a programming perspective.
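
I'm not a radar engineer, but the usual mechanism behind "sees only moving objects" is a moving-target filter: any return whose ground speed (ego speed plus the measured Doppler rate) is near zero looks just like a bridge, sign or manhole cover and gets discarded as clutter. Here's a toy sketch of the idea — made-up names and thresholds, obviously nothing from Tesla's actual code:

Code:
# Toy moving-target filter, NOT Tesla's actual logic. A return whose
# ground speed is ~zero looks like a bridge or sign and gets dropped
# so the tracker isn't flooded with roadside clutter.

def classify_radar_return(ego_speed_mps, range_rate_mps, threshold_mps=1.0):
    # range_rate_mps: Doppler-measured rate (negative = closing), so the
    # target's ground speed is the ego speed plus the range rate
    target_ground_speed = ego_speed_mps + range_rate_mps
    if abs(target_ground_speed) < threshold_mps:
        return "stationary - treated as clutter"
    return "moving - tracked"

# Car at 30 m/s approaching a stopped truck vs. a car doing 18 m/s ahead:
print(classify_radar_return(30.0, -30.0))  # stationary - treated as clutter
print(classify_radar_return(30.0, -12.0))  # moving - tracked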
 
Based on the responses I've seen so far, I tend to agree that it was not the radar that lost sight of the car but rather the visual system (front cameras), due to poor contrast between the truck and its surroundings. To the naked eye, the contrast was just fine, but cameras don't have dynamic range comparable to our eyes, as was evident from the camera on my phone. The AP camera likely saw something similar to what my phone saw.

Also, SMH, I totally forgot to have the dashcam save a snapshot of the event. Just thinking of this now.

For those who missed it in the OP, I was running 2019.8.5. But it likely wasn't firmware-specific; the low contrast weakness has been around for years. It was one of the contributing factors in the Joshua Brown fatal crash.
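
To put "poor contrast" in rough numbers: with 8-bit camera pixels, a washed-out box truck against a bright sky might differ by only a handful of grey levels, a fraction of what a dark car would give. A back-of-the-envelope sketch with invented pixel values:

Code:
# Invented 8-bit grey levels, just to show how little separates a
# light-grey truck from a hazy sky compared to a dark car.

def michelson_contrast(obj_luma, bg_luma):
    return abs(obj_luma - bg_luma) / (obj_luma + bg_luma)

truck_luma, sky_luma, dark_car_luma = 205, 218, 60

print(round(michelson_contrast(truck_luma, sky_luma), 3))     # ~0.031, nearly invisible
print(round(michelson_contrast(dark_car_luma, sky_luma), 3))  # ~0.568, easy to separate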
 
I have a humble suggestion: let's put our software version in our Sig
so we don't have to keep wondering which version a given person's car is on.

FWIW I've felt a bit nervous with the more aggressive AP in this update,
especially when I double tap but then realize I'm only in TACC, and the
car is hauling ass towards stopped traffic - will it slow down? Will it stop?
-- time to hit the brakes. Blood pressure spiking.

Would the SW version in the sig be useful for something like the context of this thread? As soon as the OP gets a new version and updates his signature, it wouldn't be applicable to the discussion anymore. It's not going to stay static at what it was when the thread was started.
 
Everyone needs to keep in mind that the radar in a Tesla is designed to see only moving objects. That makes the car very reliant on the cameras for avoiding non-moving objects. I'm just guessing, but my thought is that radar saw nothing moving and the camera just saw empty space ahead because the truck and background are all close to the same shade of grey/white. I bet if the truck had even been going 1 mph this would not have happened. It's an interesting problem from a programming perspective.

Continental claims their ARS-410 radar in HW2.5 can distinguish stationary objects from the background:
"The Advanced Radar Sensor 410 realizes a broad field of view by two independent scans. In conjunction with the high range, functions like Adaptive Cruise Control, Forward Collision Warning and Emergency Brake Assist can be easily implemented. Its capability to detect stationary objects without the help of a camera system emphasizes its performance. The Advanced Radar Sensor 410 is a best in class radar, especially for the stationary target detection and separation."

I mean the perpendicular metal rear of that truck has to return a very clear signal at that range, and a very different one from the open road, so it cannot be too difficult. To me it looks like the radar return is simply being ignored here, while the optical recognition fails at the same time.

PS: Just found this from former Tesla and BMW engineer Michael Barnard:
Tesla’s radar component is set to be more a forward but low sensor, without apparent identification of anything above the height of the hood.

Presumably it relates to the AP1 setup he was analysing (the J. Brown crash), but it still does not make much sense to me. What do you think?
 
Tesla’s inability to detect stationary objects is very concerning. The truck here visually blends into the horizon and sky. The radar just does not detect it. Broken-down cars at the side of the road never appear to be detected, and Autopilot gets way too close to them. It seems like they just can’t detect a stationary object without a lot of phantom braking issues.

Tesla needs to implement some physical constraints on the movement of nearby cars, especially directly ahead. Cars don’t just blink in and out of existence. This system is truly beta and should be treated as such.
 
Tesla needs to implement some physical constraints on the movement of nearby cars, especially directly ahead. Cars don’t just blink in and out of existence. This system is truly beta and should be treated as such.

I think the real issue here is that the Tesla vision system seems to be stateless, meaning it can change its mind radically at any time. Humans don't work like that. If I saw a truck in front of me a moment ago, then unless I see the truck move, I still know it's there.

Perhaps some sort of averaging of the states with a timeout could work in this situation. Clearly Autopilot saw the truck for an extended period of time, along with traffic all around it. If it all of a sudden disappears, there should be an averaging of states that lasts for a few seconds before the car is 100% sure it's gone.
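
The "averaging of states" idea is basically a track confidence with a slow decay. A minimal sketch of what I mean — toy numbers, not anything Tesla actually runs:

Code:
# Each tracked object keeps a confidence that rises quickly while
# detections keep arriving and decays slowly once they stop, instead
# of vanishing the moment a single frame loses it.

class TrackedObject:
    def __init__(self):
        self.confidence = 0.0

    def update(self, detected_this_frame, dt=0.05):
        if detected_this_frame:
            self.confidence = min(1.0, self.confidence + 5.0 * dt)  # confirm fast
        else:
            self.confidence = max(0.0, self.confidence - 0.4 * dt)  # forget slowly
        return self.confidence > 0.5  # still treated as present?

truck = TrackedObject()
for _ in range(40):                                  # ~2 s of solid detections
    truck.update(True)
missed = [truck.update(False) for _ in range(20)]    # ~1 s of missed frames
print(missed[-1])   # True: still assumed present after a 1 s gap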
 
Here's the moment the truck disappears.
A few observations:
The taillights merge with the cars in the two adjacent lanes (from the perspective of the autopilot cameras).
The black part of the truck is probably low enough that the computer ignores it or thinks it's a dark mark on the road.
The lift gate is about the same color as the road.
The top of the truck is about the same color as the sky.
Computer vision is really hard and that's why AP is in beta.
[Attachment 395165: frame from the video at the moment the truck disappears from the display]

This exactly. It was a perfect scenario to confuse AP. Scary, but the reality is that computers can get confused, and we've seen this a lot in situations with overpasses. I'll always keep my eyes on the road, but it's getting harder to stay 100% focused now that AP is so good compared to what it was a year ago.
 
I think the real issue here is that the Tesla vision system seems to be stateless, meaning it can change its mind radically at any time. Humans don't work like that. If I saw a truck in front of me a moment ago, then unless I see the truck move, I still know it's there.

Perhaps some sort of averaging of the states with a timeout could work in this situation. Clearly Autopilot saw the truck for an extended period of time, along with traffic all around it. If it all of a sudden disappears, there should be an averaging of states that lasts for a few seconds before the car is 100% sure it's gone.

This is a good idea in one instance and a terrible idea in another. Objects that existed a moment ago should not disappear; that part is plausible. But should an object that just appears in front of the car, one that was not there a moment ago, be ignored?
 
Here's the moment the truck disappears.
A few observations:
The taillights merge with the cars in the two adjacent lanes (from the perspective of the autopilot cameras).
The black part of the truck is probably low enough that the computer ignores it or thinks it's a dark mark on the road.
The lift gate is about the same color as the road.
The top of the truck is about the same color as the sky.
Computer vision is really hard and that's why AP is in beta.
[Attachment 395165: frame from the video at the moment the truck disappears from the display]

I think that analysis is spot on. The thing that troubles me is the inability of the AP system to rely on additional input from the radar in making this determination. Anybody who has taken a photography course and played with a camera knows that it's quite easy to "fool" a camera with contrast/perspective tricks, and my understanding is that the forward radar is supposed to provide a secondary forward collision prevention capability for exactly this kind of scenario.

Everyone should also think of the reaction that a report like this would have gotten in this forum if the OP didn't have video evidence of what it was doing. More than likely he would have been mocked & ridiculed.

Tesla needs to put more work into this, especially with the higher confidence that the AP system is trying to demonstrate in the 2019.8.5 NOA update. When using this system myself over the weekend I found its ability to navigate traffic and change lanes rather remarkable. Since I'm a skeptic I kept a wary eye on it the whole time, but it would be easy for someone who gets comfortable with its operation to tune out for a while; then, if a situation like the one in this video comes up, you're in a high-speed collision with the car in front of you.
 
Tesla needs to implement some physical constraints on the movement of nearby cars, especially directly ahead. Cars don’t just blink in and out of existence. This system is truly beta and should be treated as such.

I think the real issue here is that the Tesla vision system seems to be stateless, meaning it can change its mind radically at any time. Humans don't work like that. If I saw a truck in front of me a moment ago, then unless I see the truck move, I still know it's there.

Perhaps some sort of averaging of the states with a timeout could work in this situation. Clearly Autopilot saw the truck for an extended period of time, along with traffic all around it. If it all of a sudden disappears, there should be an averaging of states that lasts for a few seconds before the car is 100% sure it's gone.
This is a good idea in one instance and a terrible idea in another. Objects that existed a moment ago should not disappear; that part is plausible. But should an object that just appears in front of the car, one that was not there a moment ago, be ignored?

Fair point. The bias should obviously be towards not forgetting objects it has already seen, not towards ignoring ones that suddenly appear.
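
Something like the "M-of-N" track management used in classic radar trackers would give exactly that bias: quick to confirm something new in front of the car, slow to delete something it has been tracking. Rough sketch, frame counts made up:

Code:
# Toy track-management rule: a new detection becomes a confirmed track
# after a few consecutive hits, but a confirmed track survives a long
# run of misses before being dropped. Numbers are invented.

CONFIRM_HITS  = 3    # fast to accept a newly appearing object
DELETE_MISSES = 30   # ~1.5 s at 20 fps before forgetting a known object

class Track:
    def __init__(self):
        self.hits = 0
        self.misses = 0
        self.confirmed = False

    def update(self, detected):
        if detected:
            self.hits += 1
            self.misses = 0
            self.confirmed = self.confirmed or self.hits >= CONFIRM_HITS
        else:
            self.misses += 1
        # still considered present?
        return self.confirmed and self.misses < DELETE_MISSES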
 
I think that analysis is spot on. The thing that troubles me is the inability of the AP system to rely on additional input from the radar in making this determination. Anybody who has taken a photography course and played with a camera knows that it's quite easy to "fool" a camera with contrast/perspective tricks, and my understanding is that the forward radar is supposed to provide a secondary forward collision prevention capability for exactly this kind of scenario.

I have a sinking feeling that in this case, Tesla's AutoPilot system chose to ignore the warnings from the radar and overrode its input with the analysis of visual input data.

Perhaps Tesla chooses to do this because relying purely on radar inputs leads to a large number of false positives, which would then create phantom braking situations and other complications. (I've watched some of the radar data represented visually; it is interesting to see.)

If Tesla is overriding radar input with visual data, it could explain what happened here.
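
Purely speculative, but the failure mode would look something like this if stationary radar returns are only acted on when the camera independently confirms an object (which is how you'd suppress braking for every overpass and gantry):

Code:
# Speculative sketch of vision-gated radar, NOT Tesla's actual fusion.

def should_brake(radar_reports_stationary_obstacle, vision_confidence,
                 vision_threshold=0.6):
    if not radar_reports_stationary_obstacle:
        return False
    # Gating on the camera kills most phantom braking, but a washed-out
    # truck lowers vision_confidence and the radar hit is discarded too.
    return vision_confidence >= vision_threshold

print(should_brake(True, 0.9))  # True  - camera agrees, brake
print(should_brake(True, 0.2))  # False - low-contrast truck, radar ignored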

I've gone through some of the videos posted by @verygreen including this one which recorded about 10 minutes of driving in the rain. In the latter part of the video, there are several occasions where you can see the trucks 'blinking in and out' of existence.


What is extraordinary in this thread, however, is the length of time the truck was not visible and, at one point, the degree to which the car believed it had moved over to the side.

This is clearly a work in progress, and I don't get the sense that this is a new problem.
 
Humans don't work like that.
They do until they're 8-12 months old :p
Object permanence - Wikipedia
  1. 8–12 months: Coordination of Secondary Circular Reactions – This is deemed the most important for the cognitive development of the child. At this stage the child understands causality and is goal directed. The very earliest understanding of object permanence emerges, as the child is now able to retrieve an object when its concealment is observed. This stage is associated with the classic A-not-B error. After successfully retrieving a hidden object at one location (A), the child fails to retrieve it at a second location (B).[8]
 
Shouldn’t the automatic emergency braking apply in this situation?
From the manual:

"Automatic Emergency Braking is designed to automatically brake in situations where a collision is considered imminent (see Automatic Emergency Braking on page 103). Warning: Automatic Emergency Braking is not designed to prevent a collision. At best, it can minimize the impact of a frontal collision by attempting to reduce your driving speed. Depending on Automatic Emergency Braking to avoid a collision can result in serious injury or death"

"When a frontal collision is considered unavoidable, Automatic Emergency Braking is designed to apply the brakes to reduce the severity of the impact."

"Automatic Emergency Braking operates only when driving between approximately 7 mph (10 km/h) and 90 mph (150 km/h)."
 
I have a sinking feeling that in this case, Tesla's AutoPilot system chose to ignore the warnings from the radar and overrode its input with the analysis of visual input data.
If Tesla is overriding radar input with visual data, it could explain what happened here.
Of course they override the radar; otherwise the car would be slamming on the brakes all the time due to false positives.
 
Of course they override the radar; otherwise the car would be slamming on the brakes all the time due to false positives.

The other option, of course, is that the radar system failed, but I believe Tesla flags that issue explicitly with a "reduced front radar visibility" warning, which I've gotten on several occasions and which immediately disables Autosteering and TACC.

Without actually seeing the data itself, it is hard to know with certainty, but I'd agree that this is likely what is going on here.

Overriding sensor data is a very tricky thing - pilots in fighter aircraft are taught to rely on their sensors above their own analysis.

Overriding false positives naturally means you'll end up with false negatives as well.
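
Concretely: if everything reduces to one "obstacle score" and the overpass/gantry scores overlap with real stopped-vehicle scores, any threshold that silences phantom braking also lets some real stopped vehicles through. Made-up numbers, just to show the shape of the tradeoff:

Code:
# Invented scores: overhead structures sometimes score nearly as high
# as a genuinely stopped car, so no single threshold is clean.

overpass_scores    = [0.30, 0.45, 0.55, 0.62]   # should NOT trigger braking
stopped_car_scores = [0.58, 0.70, 0.85, 0.95]   # SHOULD trigger braking

def error_counts(threshold):
    false_pos = sum(s >= threshold for s in overpass_scores)
    false_neg = sum(s < threshold for s in stopped_car_scores)
    return false_pos, false_neg

print(error_counts(0.50))  # (2, 0): phantom braking, nothing missed
print(error_counts(0.65))  # (0, 1): no phantom braking, one stopped car missed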

I have been on alert during my driving sessions with autosteering after a few incidents this year; the moment I get comfortable with it, I find situations where it feels like it's either regressed or should have done considerably better.