Welcome to Tesla Motors Club
@KArnold - why do you disagree with this explanation?
Road conditions were perfect: sunny, no cars around, two-lane highway, straight (no curves or hills), flawless pavement (no color changes, no cracks, NOTHING), no bridge, nothing around, no clouds in the sky.
That's a lot of somethings for FSD to see and work with. There are myriad clues in this case; there's no excuse to phantom-brake (PB) here. PB is just bugs in the software. I'm confident someone will get there some day.
 
That's a lot of somethings for FSD to see and work with. There are myriad clues in this case; there's no excuse to phantom-brake (PB) here. PB is just bugs in the software. I'm confident someone will get there some day.
It has bugs for sure but the principle is valid.

Let's say your vision can only see 200 feet ahead. If you cannot see anything, i.e. no vehicles, would you just keep going? Or would you slow down to make sure?
 
Let's say your vision can only see 200 feet ahead. If you cannot see anything, i.e. no vehicles, would you just keep going? Or would you slow down to make sure?
I'm very confused. What is the purpose of this slowing? To avoid hitting cars that are there but that the software somehow missed? Is "correctly perceiving the environment but missing the cars" a likely failure mode? Should the software slow the car when it sees no pedestrians or other important categories of objects?
 
The just-released FSDb is much better at highway driving in my experience, though I have no opportunity to test on desolate, car-less two-lane roads. It's possible that stream, when released to the mainstream, will perform better, but there's no guarantee.
That’s the version I am on too - unfortunately it’s still very prevalent in the exact conditions you’ve described.

Today I'm heading into the city, and the roads have been swept so the lines are visible. I'm going to hit some multi-lane roads so I can do a good camera recalibration. Hopefully this will help, as I haven't been able to recalibrate since fall due to winter roads.
 
I'm very confused. What is the purpose of this slowing? To avoid hitting cars that are there but that the software somehow missed? Is "correctly perceiving the environment but missing the cars" a likely failure mode? Should the software slow the car when it sees no pedestrians or other important categories of objects?
If you cannot see anything, is that a good reason to keep moving forward? I agree that logic needs to be refined, with a better approach to evaluating that condition.
 
Happens on non-divided highways when there is an undulation in the elevation and it can't extend the road boundaries to infinity with perfect perspective. A hill alone (where nothing is visible past a certain distance) isn't enough; that case is covered. It's when there is a hill, an invisible depression, and then the road continues behind it, but from the camera's point of view there is a discontinuity in the road boundaries. I think that to the computer it looks like an object suddenly appeared in the middle of the road, and it panics.

If there was no hill, then the other case is a mirage, where the reflection of something above or behind looks like it's projected onto the road surface, and it similarly panics. You said it was sunny with no cars around. Black asphalt? Those are the conditions for a mirage that isn't broken up by other cars, either visually or via the ground-level turbulence that disrupts the stratified temperature/density gradient of the air.

The problem is that the computer doesn't have the long-term memory or understanding to know that an image which just popped up on a road it's been watching for the last minute is a mirage, whereas a human, or any animal with vision, would instinctively know that because it has a better cognitive world model.
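The missing temporal reasoning could in principle be approximated with a simple persistence check: only treat a detection as real if it has been seen consistently over recent frames. This is purely an illustrative sketch (window size, threshold, and the `DetectionFilter` class are invented here), not how FSD actually works:

```python
from collections import deque

class DetectionFilter:
    """Flag an obstacle only if it appears in most of the recent frames.

    A mirage or projection artifact that "pops" into the scene for a few
    frames gets suppressed; an object tracked continuously does not.
    Window size and threshold are illustrative assumptions.
    """
    def __init__(self, window=10, threshold=8):
        self.history = deque(maxlen=window)
        self.threshold = threshold

    def update(self, detected: bool) -> bool:
        self.history.append(detected)
        return sum(self.history) >= self.threshold

f = DetectionFilter()
# An "object" seen in only 3 of the last 10 frames: don't brake yet.
for seen in [0, 0, 0, 0, 0, 0, 0, 1, 1, 1]:
    braking = f.update(bool(seen))
print(braking)  # False
```

The tradeoff, of course, is latency: a real obstacle that genuinely appears suddenly would also be flagged a few frames late, which is presumably why a production system can't just filter this way.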

The just-released FSDb is much better at highway driving in my experience, though I have no opportunity to test on desolate, car-less two-lane roads. It's possible that stream, when released to the mainstream, will perform better, but there's no guarantee.
This makes sense. But why did mine work flawlessly from Sept 2022 until last month, over about 4 updates? Drove through rain, severe mirages, crests of hills, bad shadows, low sun in my face, literally everything. 10k miles, not a single PB.

Now it's back to its old tricks. There must be some soft settings in there that can be just right but got reset by a service visit or a recent update (I had both in the same week).
 
If the car can see clearly and there are no obstructions then, yes, it continues forward. That's because it has established that there are no obstructions (as opposed to not knowing whether there are any). I would have thought this to be obvious, which is why I was confused.
When you crest a hill where you can't see the other side, do you go at full speed even though you can't see, or do you slow down a bit to be cautious? The car is using the same logic. If it can't be sure whether there is an object there, it makes sense to slow down.

Tesla's system isn't trying to be a "dumb" L2 system that errs on the side of avoiding false positives; it's trying to be more cautious than that, especially because of well-publicized accidents (some of which were fatal).
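The slow-for-limited-sight logic described above is just stopping-distance physics: pick the largest speed at which the car can still stop within the distance it can actually see. A minimal sketch, with illustrative values for friction and reaction time (not Tesla's actual parameters):

```python
import math

def max_safe_speed(sight_distance_m, mu=0.7, reaction_time_s=1.0, g=9.81):
    """Largest speed (m/s) at which the car can stop within its sight distance.

    Solves sight = v * t_react + v**2 / (2 * mu * g) for v (a quadratic in v).
    mu is an assumed tire-road friction coefficient.
    """
    a = 1.0 / (2.0 * mu * g)   # braking-distance coefficient
    b = reaction_time_s        # reaction-distance coefficient
    c = -sight_distance_m
    return (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)

# 200 ft (~61 m) of visibility gives a fairly modest safe speed,
# which is why limited sight distance justifies slowing down.
v = max_safe_speed(61.0)
print(f"{v:.1f} m/s ({v * 2.237:.0f} mph)")
```

With these assumed numbers, 200 feet of visibility caps the safe speed at roughly 50 mph; more sight distance raises the cap, which matches the intuition in the post.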
 
When you crest a hill where you can't see the other side, do you go at full speed even though you can't see, or do you slow down a bit to be cautious?
"If the car can see clearly"

This exchange was launched from someone posting that they had clear conditions all around and still they'd get phantom braking.

But speaking of braking at the crest of a hill, I've been complaining about this for a bit now. It's one thing to smoothly slow because of a coming crest in the road. It's another to suddenly brake upon reaching the crest. It's like FSDb's old habit of braking hard after passing a pedestrian, or the current habit of braking while negotiating a curve. There's something screwy going on in the software.
 
If the car can see clearly and there are no obstructions then, yes, it continues forward. That's because it has established that there are no obstructions (as opposed to not knowing whether there are any). I would have thought this to be obvious, which is why I was confused.
How do you determine there are no obstructions? Simply because you cannot see anything? It is not that obvious.
 
If you cannot see anything, is that a good reason to keep moving forward? I agree that logic needs to be refined, with a better approach to evaluating that condition.

The common case is a simultaneous inability to extend the road boundaries that define the drivable area sufficiently far forward, plus the appearance of some image in the roadway that it can't categorize, nor determine its distance to with sufficient confidence and accuracy. It works better when following other cars: cars have a pretty well-known rear width, and standardized license plates provide a physically calibrated known size. So if it sees a car in front, it knows approximately where that car is, and since that car didn't crash, it can also tell that the car in front is moving at a similar speed through the same space, so it's more confident it can do the same.
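The "known width gives distance" point is the standard pinhole-camera similar-triangles relation: distance = focal length × real width / apparent width. A minimal sketch, where the focal length in pixels is an assumed camera parameter, not a Tesla spec:

```python
# Monocular distance to an object of known physical width
# (pinhole camera model, similar triangles).
PLATE_WIDTH_M = 0.305      # a US license plate is about 12 in (0.305 m) wide
FOCAL_LENGTH_PX = 1400.0   # assumed focal length in pixels, for illustration

def distance_from_plate(pixel_width):
    """distance = f * real_width / apparent_width."""
    return FOCAL_LENGTH_PX * PLATE_WIDTH_M / pixel_width

# A plate appearing 21 px wide would be roughly 20 m ahead.
print(distance_from_plate(21.0))
```

An unknown blob on the road gives the system no such anchor: without a known physical size, a small nearby mark and a large distant object can produce the same image, which is exactly the ambiguity the post describes.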

PB is infrequent enough that it's an effect in the tails of the probability distribution. The system still works in most cases, but this is a circumstance where 99.99% isn't good enough: one failure in every ten thousand seconds is one every ~2.8 hours.

Turning off sensitivity to this effect increases the likelihood of running into parked cars or trucks partially in the roadway, which was previously a common failure. Phantom braking is better than real collisions.

The system could be doing as well as can be expected given its inputs---which is why people debate the consequences of a low-fidelity, low-cost sensor set.
 
They still haven't fixed one of the most annoying phantom braking issues for me, which is slamming the brakes whenever the car sees a blinking yellow light, of which there are many around here on speed limit signs, etc. Quite sketchy on the highway. I've basically stopped using it.
Yet another tradeoff---the system increased its sensitivity to the blinking yellow lights on roadside emergency vehicles (tow/fire/police trucks), because it previously smashed into some of them. Some of that sensitivity leaked into other behavior---as a technical matter, neural network classifiers naively respond to individual features without requiring their conjunction (i.e. this AND that).
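That "features without conjunction" point can be illustrated with a toy logistic model. The feature names, weights, and bias here are invented purely to show the leakage mechanism; no real model is this simple:

```python
import math

# A linear classifier scores features independently; it has no built-in
# notion of "blinking yellow AND roadside emergency vehicle".
# Weights and bias are made up for illustration.
WEIGHTS = {"blinking_yellow": 3.5, "roadside_vehicle": 2.0}
BIAS = -3.0

def p_hazard(features):
    """Logistic score from a weighted sum of binary features."""
    score = BIAS + sum(WEIGHTS[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-score))

# Trained to fire on emergency vehicles with flashers: probability is high.
print(p_hazard({"blinking_yellow": 1, "roadside_vehicle": 1}))
# But the blinking-light feature ALONE still pushes the output past 0.5,
# so a flashing speed limit sign inherits the braking response.
print(p_hazard({"blinking_yellow": 1, "roadside_vehicle": 0}))
```

Because the blinking-light weight was raised to protect emergency vehicles, any input with that one feature lit gets a big score bump, which is one plausible mechanism for the speed-limit-sign braking described above.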

And blinking yellow lights on speed limit signs are almost absent from California...
 
The common case is a simultaneous inability to extend the road boundaries that define the drivable area sufficiently far forward, plus the appearance of some image in the roadway that it can't categorize, nor determine its distance to with sufficient confidence and accuracy. It works better when following other cars: cars have a pretty well-known rear width, and standardized license plates provide a physically calibrated known size. So if it sees a car in front, it knows approximately where that car is, and since that car didn't crash, it can also tell that the car in front is moving at a similar speed through the same space, so it's more confident it can do the same.

PB is infrequent enough that it's an effect in the tails of the probability distribution. The system still works in most cases, but this is a circumstance where 99.99% isn't good enough: one failure in every ten thousand seconds is one every ~2.8 hours.

Turning off sensitivity to this effect increases the likelihood of running into parked cars or trucks partially in the roadway, which was previously a common failure. Phantom braking is better than real collisions.

The system could be doing as well as can be expected given its inputs---which is why people debate the consequences of a low-fidelity, low-cost sensor set.

Yet another tradeoff---the system increased its sensitivity to the blinking yellow lights on roadside emergency vehicles (tow/fire/police trucks), because it previously smashed into some of them. Some of that sensitivity leaked into other behavior---as a technical matter, neural network classifiers naively respond to individual features without requiring their conjunction (i.e. this AND that).

And blinking yellow lights on speed limit signs are almost absent from California...
Both of these are valid points - for when FSD / Autopilot is engaged.

I wish they'd "dumb it down" when just cruise control is engaged. Having to step all the way down to having no cruise control at all when PB conditions exist is a real hard pill to swallow in a $130K car.

All I want is a dumb cruise control option. Turn off TACC, turn off everything, just give me dumb cruise.
 
Both of these are valid points - for when FSD / Autopilot is engaged.

I wish they'd "dumb it down" when just cruise control is engaged. Having to step all the way down to having no cruise control at all when PB conditions exist is a real hard pill to swallow in a $130K car.

All I want is a dumb cruise control option. Turn off TACC, turn off everything, just give me dumb cruise.
That is like asking for a human being that is no smarter than a monkey.
 
That is like asking for a human being that is no smarter than a monkey.
For some jobs, a monkey could do it better than a human.

It's very simple: TACC doesn't work where I live. Give me an option to utilize cruise control without TACC, at least until (maybe some day) they sort out TACC for ALL LOCATIONS AND CONDITIONS.

Funny, I never had these issues when my radar was active... But radar did have its own issues in other locations and conditions, just never for me.
 