Welcome to Tesla Motors Club

FSD Beta 10.69

Multiple safety disengagements (including running a red light)


@AlanSubie4Life: a while ago we were talking about the stopping behavior between stop signs and red lights. I forget exactly what we were discussing, but in this video, at 2:35, you can see a pretty big difference between how it handles coming to a stop for a stop sign vs at 3:15 when the car stops for the red light.

It seems so strange to me that the behaviors need to be different. Both sign and signal typically have a stop line (although all bets are off for either where I live). But in my experience, slowing for a red light has almost always been very human-like and smooth, whereas it's all over the place for stop signs. You may have had a poorer experience with red lights.
 
In this particular case, the traffic lights are badly hung. I could easily see human drivers being confused about whether the light is red or green:

(Attached screenshot: Screenshot_20220908-163416-599.png)
 
But in my experience, slowing for a red light has almost always been very human-like and smooth, whereas it's all over the place for stop signs.
Definitely worse for stop signs.

Lights can be OK (even good! I had a couple of good results this morning), but sometimes they are not. It's hard to predict the exact conditions that yield different results.
 
@AlanSubie4Life: a while ago we were talking about the stopping behavior between stop signs and red lights. I forget exactly what we were discussing, but in this video, at 2:35, you can see a pretty big difference between how it handles coming to a stop for a stop sign vs at 3:15 when the car stops for the red light.

It seems so strange to me that the behaviors need to be different. Both sign and signal typically have a stop line (although all bets are off for either where I live). But in my experience, slowing for a red light has almost always been very human-like and smooth, whereas it's all over the place for stop signs. You may have had a poorer experience with red lights.
Great observation. Stoplight behavior is generally spot-on for both stopping and starting, though vague crosswalk markings can make it stop somewhat short at some lights here.
 
@AlanSubie4Life: a while ago we were talking about the stopping behavior between stop signs and red lights. I forget exactly what we were discussing, but in this video, at 2:35, you can see a pretty big difference between how it handles coming to a stop for a stop sign vs at 3:15 when the car stops for the red light.

It seems so strange to me that the behaviors need to be different. Both sign and signal typically have a stop line (although all bets are off for either where I live). But in my experience, slowing for a red light has almost always been very human-like and smooth, whereas it's all over the place for stop signs. You may have had a poorer experience with red lights.
I'm curious whether signals are purely visual while stop signs are both visual and map-based. There have been people who've said their Tesla stopped for a stop sign that didn't exist visually, but must have been in older map data. I think I also recall someone saying their Tesla ran through a stop sign, possibly because it was a newer sign that wasn't in the map data yet - but I may be misremembering that one.
 
Failures with no explanation are much worse than failures with simple explanations. In this case, the explanation is simple: FSD doesn't consider traffic-light facing angles, and it saw the green light over its lane.

The green might have been closer to the lane's centerline than the red, and the light-to-lane matching algorithm picked it; matching by proximity is usually the right thing to do.
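If the matcher really is a nearest-centerline heuristic, the failure mode is easy to sketch. Everything below (the function name, the pixel coordinates) is invented for illustration; this is not Tesla's actual code:

```python
# Hypothetical sketch of a nearest-centerline light-to-lane matcher,
# showing how a badly hung green can beat the correct red.
# All names and numbers are invented for illustration.

def match_light_to_lane(lights, lane_center_x):
    """Pick the detected light whose horizontal position is closest
    to the lane's centerline in image coordinates."""
    return min(lights, key=lambda l: abs(l["x"] - lane_center_x))

# Lane centerline at x=400 px. The red facing our lane is mounted
# off to the side; a green meant for cross traffic sits nearer.
lights = [
    {"color": "red",   "x": 520},  # ours, but hung off-center
    {"color": "green", "x": 430},  # cross traffic's, near our centerline
]
chosen = match_light_to_lane(lights, lane_center_x=400)
print(chosen["color"])  # the mismatched green wins
```

In the normal case, where each light hangs roughly over its own lane, the same heuristic picks correctly, which is why it is "usually the right thing to do."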

I've never seen an intersection with lights hung as poorly as that in California, which accounts for the bulk of Tesla-driven miles and probably of the real-world training data. If it were my first time at that intersection (and Autopilot has no memory), I could have made the same mistake.

In a machine-learning model, there is probably no label category for "visible light, but intended for a different direction." And even if there were, "visible light for my direction" examples would so far outnumber it that including it could carry a significant cost: false positives for the "other direction" category would cause the system to erroneously ignore its own light, likely far more often than it currently misses a situation like this one. There are always difficulties like this in ML systems; my employer has to deal with clients complaining "why didn't the model detect X, Y, Z?" It's probably because the sensitivity needed to detect that would have produced excessive other false identifications, given the extreme imbalance in label ratios.
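A toy calculation (all numbers invented) of why the label imbalance bites: even a tiny false-positive rate on the huge "my direction" class can cost more than the rare "other direction" class is worth catching.

```python
# Toy illustration (all numbers invented) of the label-imbalance
# tradeoff: with a rare "light for another direction" label, even a
# tiny false-positive rate on the common class produces more errors
# than the rare class has catchable examples.

own_lane_lights = 1_000_000  # "visible light for my direction" examples
other_dir_lights = 500       # rare "meant for another direction" examples

false_positive_rate = 0.001  # common lights misread as "other direction"
recall_on_rare = 0.80        # rare lights correctly flagged

missed_own_lights = own_lane_lights * false_positive_rate  # own lights ignored
caught_rare = other_dir_lights * recall_on_rare            # rare cases caught

print(missed_own_lights, caught_rare)
# The classifier now erroneously ignores more real lights (1000)
# than the rare cases it catches (400).
```

Under these assumed rates the detector trades 400 correct rare-case catches for 1,000 newly ignored real lights, which is the shape of the argument above.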
 
There have been people who've said their Tesla stopped for a stop sign that didn't exist visually, but must have been in older map data.
There is also the case of it stopping at intersections that are implicit yields. This is common in residential neighborhoods: intersections with no signs, where right of way is just "known"; usually one street takes precedence. But you don't have to stop if there is no traffic, even when turning left. You do have to know which street has right of way, though.

Not sure how FSD Beta can ever be expected to handle that properly without well-maintained maps (I had some pictures of one of these intersections a while back - here it is - FSD did not stop here, though I suspect it might now; clearly Brookburn does not have right of way). They could generate the information from drivers at that intersection, of course; it wouldn't be long before they figured out an appropriate approach speed. I don't see how it is knowable from vision alone: a human would not know exactly, either, the first time.

Here’s another example. No need to stop here (for either a right or left, though I only ever turn right), just slow to a speed which will allow for a pleasant stop if you do have to stop, then cruise right through if there is no traffic from the left.

FSD tries to stop for this (very, very early) and I just push the accelerator. Annoying, because it tends to make the car turn wide (potentially into traffic).
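The "slow to a speed which will allow for a pleasant stop" idea reduces to the constant-deceleration relation v² = 2ad. A quick sketch, assuming a comfortable deceleration of about 2 m/s² (my number, not FSD's):

```python
import math

# Back-of-envelope sketch (assumed numbers): the fastest speed from
# which a comfortable stop is still possible at the stop point, using
# v^2 = 2*a*d for constant deceleration a over distance d.

def max_comfortable_approach(dist_m, decel_mps2=2.0):
    """Highest speed (m/s) that still allows a stop within dist_m
    at a constant deceleration of decel_mps2."""
    return math.sqrt(2 * decel_mps2 * dist_m)

for d in (10, 20, 40):
    v = max_comfortable_approach(d)
    print(f"{d} m to stop point: {v:.1f} m/s ({v * 2.237:.0f} mph)")
```

So at 40 m out you can still be doing roughly 28 mph and stop pleasantly, which is why braking "very very early" feels so unnatural.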
 
There is also the case of it stopping at intersections that are implicit yields. This is common in residential neighborhoods: intersections with no signs, where right of way is just "known"; usually one street takes precedence. But you don't have to stop if there is no traffic, even when turning left. You do have to know which street has right of way, though.

Not sure how FSD Beta can ever be expected to handle that properly without well-maintained maps (I had some pictures of one of these intersections a while back). They could generate the information from drivers at that intersection of course.

Here’s an example. No need to stop here (for either a right or left, though I only ever turn right), just slow to a speed which will allow for a pleasant stop if you do have to stop, then cruise right through if there is no traffic from the left.

FSD tries to stop for this and I just push the accelerator. Annoying, because it tends to make the car turn wide.
I do recall you discussing those intersections previously. Right now, there's just no way for the neural net to instinctively know whether it needs to slow down for an unmarked intersection. Without HD maps, it must treat each unmarked intersection with caution, slowing or stopping to make sure it can visually observe oncoming traffic. With HD maps, it could look up the speed limit of the crossing street and calculate the reaction time it needs to safely handle an oncoming vehicle.

According to CHP traffic rules, the car must at least slow down:
  • At intersections without “STOP” or “YIELD” signs, slow down and be ready to stop. Yield to traffic and pedestrians already in the intersection or just entering the intersection. Also, yield to the vehicle or bicycle that arrives first, or to the vehicle or bicycle on your right if it reaches the intersection at the same time as you.
  • At “T” intersections without “STOP” or “YIELD” signs, yield to traffic and pedestrians on the through road. They have the right-of-way.
It may just be extra cautious during Beta, and become more assertive/confident as the NNs are trained.
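The map-based check described above can be sketched roughly: given the crossing street's speed limit and the available sight distance, is there time to clear the intersection before a just-out-of-view car could arrive? All numbers and thresholds here are my own assumptions for illustration:

```python
# Rough sketch (assumed numbers, my own construction) of the
# map-based reasoning: with the crossing street's speed limit known,
# compare the time a just-out-of-sight car needs to reach the
# intersection against the time needed to clear it.

def safe_to_proceed(sight_dist_m, cross_limit_mph, time_to_clear_s, margin_s=1.0):
    """True if a car entering view at the speed limit cannot reach the
    intersection before we clear it, with a safety margin."""
    cross_speed_mps = cross_limit_mph * 0.447  # mph -> m/s
    time_until_arrival = sight_dist_m / cross_speed_mps
    return time_until_arrival >= time_to_clear_s + margin_s

# 60 m of visibility, 25 mph crossing street, ~3 s to cross:
print(safe_to_proceed(60, 25, 3.0))  # True
# Same geometry but a 50 mph crossing street:
print(safe_to_proceed(60, 50, 3.0))  # False
```

The second case shows why the speed limit of the crossing street matters: identical visibility can be fine for a 25 mph residential street and unsafe for a 50 mph arterial, and vision alone cannot supply that number.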
 
These power shifts are highly noticeable. I would point out that speed changes of 2-3 mph are frequently flagged as phantom braking, and the speed here changed by at least 4 mph. The speed did indeed change! Humans are amazingly sensitive to jerk.

What is the reason for this behavior? Is there a good reason for it to behave this way?
FSD Beta seems to architecturally lack hysteresis: every reaction happens immediately, frame by frame, with no smooth averaging over time around a non-emergency control target. I don't know if this is something they can fix within the existing control framework, or if they need to add an additional "chauffeur" layer that is given control targets and then executes them gracefully within certain time and location bounds, based on customer preference settings.
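The "chauffeur layer" idea can be sketched as a low-pass filter plus a deadband. This is my own toy construction, not anything from Tesla's stack:

```python
# Minimal sketch (invented, not Tesla's controller) of hysteresis
# around a control target: low-pass filter the raw per-frame target
# speed and ignore changes smaller than a deadband.

def smoothed_targets(raw_targets, alpha=0.2, deadband=1.0):
    """Exponential moving average with a deadband: single-frame jitter
    in the target is ignored; sustained changes are eased into."""
    out, current = [], raw_targets[0]
    ema = float(raw_targets[0])
    for t in raw_targets:
        ema += alpha * (t - ema)           # low-pass filter the target
        if abs(ema - current) > deadband:  # only move outside deadband
            current = ema
        out.append(round(current, 2))
    return out

# A noisy perception stream (mph): a one-frame dip to 21 no longer
# yanks the commanded speed, while the sustained drop to 18 is
# followed gradually instead of instantly.
raw = [25, 25, 21, 25, 25, 25, 25, 18, 18, 18, 18, 18]
print(smoothed_targets(raw))
```

The tradeoff is latency: the filter reacts a few frames late to real changes, which is exactly the kind of time-and-comfort bound a "chauffeur" layer would have to manage.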
 
At intersections without “STOP” or “YIELD” signs, slow down and be ready to stop.
Agreed on the slowing down, that seems correct. Though in the example above, slowing down on Ashley Falls would be unnatural and potentially put you at risk of being hit from behind if you did so. Certainly caution is warranted, especially if potential crossing traffic is detected.
 
This should not be confusing to a person. It’s not good placement, but seems pretty obvious whose light is whose.

If I came upon it in the evening or at night, I could easily get it wrong. So could people with imperfect long-distance vision who can't make out the fine details of the light fixture (which you shouldn't need to look at directly anyway).

It is confusing. Something as critical as telling apart lights for opposing directions, a major safety issue if you get it wrong, should be an automatic neural response, not something requiring cognitive analysis. That is why signals use bright colored lights instead of dim signs or mechanical arrows.
 
There are always difficulties like this in ML systems; my employer has to deal with clients complaining "why didn't the model detect X, Y, Z?" It's probably because the sensitivity needed to detect that would have produced excessive other false identifications
Humans really have no problem with this though. I hope there is a breakthrough in ML soon beyond “more training.” I’m not confident that more training is the answer in situations like this.

It's a super easy case, and very, very common. FSD should be able to do it perfectly, with zero false positives (and no false negatives either!). It may take some special sauce. Since we are looking at wide release by the end of the year and this is safety-critical, I assume they must have a plan to solve it.
 
Without HD maps, it must treat each unmarked intersection with caution, slowing or stopping to make sure it can visually observe oncoming traffic.
One improvement would be not slowing too early and confidently rolling to a stop at a location that allows visibility. Even a human completely unfamiliar with the intersection can do this, so it should be reasonable to expect that from a capable system. Right now, even setting aside the default stop behavior, it cannot get the approach to the stop correct. (An example was provided in the video above. Very common.)

Arguably it should be even better in these cases than when a stop line is provided, because those stop lines are often placed too early.