Welcome to Tesla Motors Club

FSDb Turns Left In Front of Oncoming Car at Last Moment

[Embedded YouTube video]

So I came across this FSDb video (pretty sure it's 11.4.9) that I thought was particularly worth discussing, because FSDb not only made a dangerous maneuver but also gave the driver very little warning that it was about to make the move.

The navigation is set to drive straight at the traffic light. At the last moment FSDb throws on a left turn signal and makes a sudden left turn, cutting in front of oncoming traffic. My guess for why it decided to turn left is that it incorrectly concluded it was in a left-turn-only lane, so it made the left to follow traffic rules. But that doesn't explain why it failed to yield to the oncoming car. The oncoming car does appear in the FSDb visualization. There is a moment where it might've been occluded by the vehicle directly ahead (it's hard to tell because the blind spot camera pops up over the visualization). But even if it couldn't see the oncoming car, FSDb is usually very cautious about turning at intersections, even when there are no other cars around. My guess is that a sort of race condition occurred: it entered the "make left turn" routine in an unusual way (after concluding it was in a left-turn-only lane) that circumvented the "is it clear to turn?" check.
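To make that hypothesis concrete, here's a purely illustrative sketch. Every name and branch here is invented; nothing comes from Tesla's actual software. It just shows the shape of the bug I'm guessing at: a late "wrong lane!" realization that reaches the turn decision through a path that never runs the yield check.

```python
# Hypothetical sketch of the guessed race condition. All names are
# invented for illustration; this is not Tesla's code.

def plan_through_intersection(nav_says_straight, lane_is_left_turn_only, oncoming_clear):
    if nav_says_straight and not lane_is_left_turn_only:
        return "go straight"
    # Normal left-turn path: the clearance check runs first.
    if not lane_is_left_turn_only:
        if oncoming_clear:
            return "turn left"
        return "wait"
    # Hypothesized buggy path: a late left-turn-only detection jumps
    # straight to the turn to satisfy lane rules, skipping the check.
    return "turn left"

# With an oncoming car present and a late left-turn-only detection,
# this sketch commits to the turn without ever checking clearance:
assert plan_through_intersection(True, True, False) == "turn left"
```

If something like this is what happened, the fix would be making the clearance check unconditional for any path that ends in a turn, no matter how the planner got there.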

I've seen some bad FSDb mistakes, but this one is particularly discouraging because there was little an attentive driver could have done to avoid the potential accident. Luckily the oncoming car reacted and braked in time.
 
About the reaction time thing, when I have my hands on the wheel I can feel any “uncertainty” of the car nearly instantly, if that makes sense. If the car deviates from the path at all you know right away.

I would have grabbed the wheel tighter the moment I felt the uncertainty, which would have immediately disengaged autopilot as the wheel began to turn to the left.

In this video though, the driver let it get most of the way through the maneuver before taking over. You can see the disengagement doesn’t happen until much later through the turn.

Of course, this is all much easier said after the fact.
 
This is a non-issue. Just keep both hands on the wheel and the disengagement will be nearly instantaneous.

Remember that there is a lot of visual evidence that unmonitored FSD Beta is much less capable than a human driver, and no data at all indicating it is more capable or safer than a human driver.

So it is not surprising (and in fact expected!) that when unmonitored it will have many accidents (obviously it was due to attentive driving on the part of the other driver that no accident occurred here).

This is why use of FSD Beta requires both hands on the wheel and eyes on the road at all times.

I've seen some bad FSDb mistakes but this one is particularly discouraging because there was little an attentive driver could have done to avoid the potential accident.

It really is very easy to disengage when hands are on the wheel at 9 and 3 - there is nowhere for the wheel to go. It would be terrifying for passengers but as a driver you’d never feel like there was any actual risk. There would just be a jerk to the left then you’d disengage.

This also shows the dangers and lack of leverage provided by hands on the wheel at the bottom of the wheel. There’s really a lot less control of the vehicle when starting at that position.
 
Not trying to defend FSD, but my initial thought from the title was his car cut in front and nearly got T-boned.

But in viewing more closely, FSD detected the other car was also turning and just did a bad job of getting around the corner ahead of him.

Still not very defensible.
 
Not trying to defend FSD, but my initial thought from the title was his car cut in front and nearly got T-boned.

But in viewing more closely, FSD detected the other car was also turning and just did a bad job of getting around the corner ahead of him.

Still not very defensible.
Does FSDb predict the path of other vehicles? And how can you tell it detected that the other car was turning? My understanding is that FSDb will not detect turn signals or any other form of intent from other vehicles.
 
About the reaction time thing, when I have my hands on the wheel I can feel any “uncertainty” of the car nearly instantly, if that makes sense. If the car deviates from the path at all you know right away.

I would have grabbed the wheel tighter the moment I felt the uncertainty, which would have immediately disengaged autopilot as the wheel began to turn to the left.

In this video though, the driver let it get most of the way through the maneuver before taking over. You can see the disengagement doesn’t happen until much later through the turn.

Of course, this is all much easier said after the fact.
This is why I ALWAYS hang on to the wheel. I can feel what the car is doing and take over immediately if needed. Tesla also states to keep hands on the wheel.
 
Does FSDb predict the path of other vehicles? And how can you tell it detected that the other car was turning? My understanding is that FSDb will not detect turn signals or any other form of intent from other vehicles.
FSD definitely detects turn signals. When the driver in the lane next to you signals for your lane, FSD will slow to let him in.
 
Good to know that FSDb recognizes turn signals, which is very cool. But even then, I think the oncoming car had right of way, considering it was turning right and was relatively close to the intersection. It seems unusual that FSDb would pass in front of the car given how timid it usually is at intersections. That leads me to believe it was more of a software error than operation by design. There's also the issue of the car going up the sidewalk, though I'm not sure if that was due to FSDb or the human driver.
 
This is a non-issue. Just keep both hands on the wheel and the disengagement will be nearly instantaneous.

Remember that there is a lot of visual evidence that unmonitored FSD Beta is much less capable than a human driver, and no data at all indicating it is more capable or safer than a human driver.

So it is not surprising (and in fact expected!) that when unmonitored it will have many accidents (obviously it was due to attentive driving on the part of the other driver that no accident occurred here).

This is why use of FSD Beta requires both hands on the wheel and eyes on the road at all times.



It really is very easy to disengage when hands are on the wheel at 9 and 3 - there is nowhere for the wheel to go. It would be terrifying for passengers but as a driver you’d never feel like there was any actual risk. There would just be a jerk to the left then you’d disengage.

This also shows the dangers and lack of leverage provided by hands on the wheel at the bottom of the wheel. There’s really a lot less control of the vehicle when starting at that position.
I didn't realize until re-watching that his hands might've been off the wheel. 9 o'clock is out of view, but his right hand is definitely not at 3. I would like to think that with my hands and feet ready (as they usually are) I would be able to respond quickly enough, though I am a bit skeptical in this case. It does take a moment to realize FSDb is committing to a maneuver, even if you feel the steering wheel moving. FSDb makes seemingly aimless movements with the steering wheel quite often, so it would be easy to dismiss a movement of the wheel as one of its random twitches that it will automatically correct.

I'm still very optimistic and excited about FSD, and believe in most cases its errors can be corrected by an attentive driver. But I am now more curious about the possibility of sudden maneuvers that leave no time to react. For example, could it do a sudden jerk of the steering wheel while driving at 70 mph on the highway? I don't think such incidents have been reported. Surely it's physically capable of doing so; the only thing stopping it would be some lines of code.
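For what it's worth, the "lines of code" I have in mind would be something like a speed-dependent steering-rate limiter. This is a hypothetical sketch with made-up numbers and names; I have no idea what limits, if any, Tesla actually applies:

```python
# Illustrative only: a speed-dependent clamp on how fast the wheel may
# move in one control tick. Numbers and names are invented, not Tesla's.

def limit_steering_rate(requested_deg, current_deg, speed_mph, dt_s):
    """Clamp the wheel's movement per tick, tighter at higher speed."""
    # Hypothetical limit: ~90 deg/s when stopped, floor of 10 deg/s at speed.
    max_rate_deg_s = max(10.0, 90.0 - speed_mph)
    max_step = max_rate_deg_s * dt_s
    delta = requested_deg - current_deg
    if delta > max_step:
        delta = max_step
    elif delta < -max_step:
        delta = -max_step
    return current_deg + delta

# A sudden 45-degree jerk requested at 70 mph in a single 20 ms tick
# gets clamped to a fraction of a degree of actual wheel movement.
wheel = limit_steering_rate(45.0, 0.0, 70.0, 0.02)
```

If a limiter like this exists, a highway jerk would be smoothed into a gradual drift you'd have time to catch; if it doesn't, the failure mode I'm worried about is possible.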
 
I would like to think that with my hands and feet ready (as I usually am) that I would be able to respond quickly enough, though I am a bit skeptical in this case.
It’s not an issue. You’ll be ready.

Here’s an example I have posted in the past. Not an issue at all, a left turn at 25 mph.


FSDb makes seemingly aimless movements with the steering wheel quite often. It would be easy to dismiss a movement of the wheel as one of its random movements that it will automatically correct.
Don’t do this. Always Be Disengaging. That way you’ll be ready. Always always disengage when it does something wrong, even if it is super minor. Whether that is slowing down, speeding up, steering incorrectly, changing lanes when you would not, not changing lanes when you would, etc.

If you do this, disengagements will become second nature and routine. The more often you disengage, the better.

It’s very important to be a highly trained safety driver as you supervise FSD. By disengaging and ensuring FSD drives exactly how you want, you will get better at anticipation and also transitioning to manual control, becoming more highly trained. This Is The Way.

Remember that ALL the available observations suggest you are much more likely to be in an accident if you use FSD Beta unsupervised than if you just drive yourself. So supervise it very very closely to increase the likelihood you get a safety benefit. One adequate metric to determine whether you are doing a good job: if you get ANY nags which flash blue or red (other than the ones due to system uncertainty, which are unavoidable), that means you could and should be doing a better job.
Surely it's physically capable of doing so. The only thing stopping it would be some lines of code.
Soon to be replaced by millions of coefficients, none of which are understood.
 