FSD Beta 11.3.3 Problems #2


I'm done for the time being with trying out FSD beta 11.3.3, especially after the rather dangerous interaction turning left into two oncoming vehicles. I know others have had a lot of luck in urban environments with this build, particularly with left turns, but my experience is quite the contrary. The software can usually handle left turns at simple, relatively flat, perpendicular intersections, but it can't cope with the types of intersections in my area.

I'm pretty unhappy with the school zone speed sign problem. This should be a high priority for a company that prides itself on vehicle safety.
 

So it doesn't make sense to you to use it in other situations? There is no law that says that you have to use it 100% of the time.
 
Unprotected lefts have always been bad for me as well. The car will enter the intersection, and then the steering angle twitches every few seconds, snapping from 0 to 30 degrees and back to 0 very quickly, as if it momentarily wants to go, even when there is a steady stream of oncoming cars with no gaps.
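For a rough sense of what that twitch looks like in numbers, here is a minimal Python sketch that flags a steering angle swinging out and back within a fraction of a second. The sample format, thresholds, and function name are all made up for illustration; this is not how the car's firmware or FSD beta actually detects anything.

```python
# Illustrative only: flag the kind of steering "twitch" described above
# (roughly 0 -> 30 degrees -> 0 within a fraction of a second).
# Thresholds, sample format, and function name are assumptions for the sketch.

def detect_steering_twitch(samples, swing_deg=25.0, window_s=0.5):
    """samples: time-ordered list of (timestamp_s, steering_angle_deg) tuples.

    Returns timestamps where the angle swings by more than `swing_deg`
    and returns most of the way back within `window_s`.
    """
    twitches = []
    for i, (t0, a0) in enumerate(samples):
        # Look at everything within the window after this sample.
        window = [(t, a) for t, a in samples[i + 1:] if t - t0 <= window_s]
        if not window:
            continue
        peak = max(abs(a - a0) for _, a in window)
        _, a_end = window[-1]
        returned = abs(a_end - a0) < swing_deg / 3  # came back near the start
        if peak >= swing_deg and returned:
            twitches.append(t0)
    return twitches

# Made-up data: angle snaps from 0 to 30 degrees and back within ~0.4 s.
samples = [(0.0, 0.0), (0.1, 12.0), (0.2, 30.0), (0.3, 14.0), (0.4, 1.0), (1.0, 0.0)]
print(detect_steering_twitch(samples))  # -> [0.0]
```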
 
Agree, unprotected left turns are the main failure with FSD for me. If Tesla could clean these turns up, it would really help.

Yup, school zones and stopped school buses with flashing lights need to be addressed, but I'd also like to see the No Stop on Red issue fixed first.
 
Just a note: watching your videos is somewhat nerve-wracking, but perhaps for a different reason. I disengage FSD beta way before the car gets into such dangerous situations. Your comments show you're even aware of it, but you don't actually disengage FSD beta until much later. Disengage the moment it looks hairy. We don't need to push it so far that we get into bad situations. It's known not to be autonomous driving. We are explicitly told to pay attention and be in control at all times. We are told it may do exactly the wrong thing at the wrong time. We should use it like we understand this.

I have been testing FSD beta on almost every drive since public access began 1.5 years ago (including 3 cross-country trips). I only allow the car to get into these situations when no other drivers are around. Some of these situations only come up when other drivers are around, but that's okay. Tesla doesn't need us crashing to get the data to advance AV solutions. Disengaging because we aren't sure, or are worried the car may do the wrong thing, is exactly the data Tesla needs.

As Tesla continues to state, this is an early limited release. As beta testers, our primary directive is still to operate the vehicle safely. This can be done with FSD beta engaged, but at least in these cases it requires earlier disengagements. There are SO MANY edge cases and situations the developers need to address, both safety and convenience/comfort. Let's advance these ADAS and AV programs by giving them useful data, not clickbait for short sellers, haters, and regulators looking for an excuse to stop or slow innovation.
 
I agree with the idea of not running FSD beta in heavy traffic conditions. Its decision process seems to break down, which may be a sign that there isn’t sufficient computing power available. Some cases without heavy traffic are still beyond its capabilities. I’ll still do some testing as I get future versions. I see too many others running successful tests & leaving the impression that FSD beta is further along than it really is.

I have definitely found the edges of its current capabilities.
 
I am not sure why you are driving this way. You are the driver and the car must never do anything you would not do.

This often means disengaging well in advance of any bad behavior. (With some experience.)

It’s of zero benefit and in fact harmful to let the system try.

Just disengage. No need to worry about reporting it.

Every time the steering wheel jerks, be sure to disengage (hold it firmly enough to be sure this happens). This is crucial for improving FSD. (Yes, this means that no unprotected left is likely to be completed - but that is correct, and will rapidly advance development). Do not permit the system to turn the wheels in advance of a turn. This is unsafe.

Obviously if there is zero traffic you can be more lax and familiarize yourself with its failings (recommended before using with traffic!), but with other vehicles around this is the correct standard.

Do not let the car deviate from the exact path you would take. This includes entering turn lanes. Just disengage!!!
 
Whatever you do, every time it makes one of those mistakes, regardless of whether it's a small one or a really dangerous one like trying to turn into oncoming traffic, make sure you press the icon on the screen that reports it directly to Tesla. On my Model Y, it's the car icon on the center screen that brings up the settings screens; if you press and hold it, it will send that information to Tesla so they can review it.

That being said, I've had nothing but problems with this version of FSDB… specifically the automatic lane change that can't be turned off. I find that the car very frequently tries to change into lanes that aren't actually lanes, or attempts to pass traffic on the right-hand side.

I drove home yesterday from Fairfax, Virginia, and it was totally confused by the I-66 express lanes. It can't keep track of the fact that the speed limit in those express lanes is 70 miles an hour; it reads the speed limit signs for the local lanes and keeps flipping up and down between speed limits, and it changes lanes for no reason whatsoever, seemingly at random.
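As an illustration of the kind of filtering that would stop that flip-flopping, here is a small Python sketch that only accepts a new speed limit after the conflicting reading has persisted for a few seconds. The class, the hold time, and the sample readings are all assumptions; this is not Tesla's actual sign-handling logic.

```python
# Hypothetical sketch: damp speed-limit flip-flopping by only accepting a new
# limit after it has been read consistently for a few seconds.

class DebouncedSpeedLimit:
    def __init__(self, initial_mph, hold_s=5.0):
        self.current = initial_mph          # limit currently in effect
        self.hold_s = hold_s                # how long a new reading must persist
        self._candidate = None              # conflicting reading being evaluated
        self._candidate_since = None

    def update(self, reading_mph, t_s):
        """Feed one sign reading at time t_s (seconds); return the accepted limit."""
        if reading_mph == self.current:
            # Reading agrees with what we're already using; drop any candidate.
            self._candidate = None
            return self.current
        if reading_mph != self._candidate:
            # New conflicting value: start timing it.
            self._candidate = reading_mph
            self._candidate_since = t_s
        elif t_s - self._candidate_since >= self.hold_s:
            # The conflicting value has persisted long enough; switch over.
            self.current = self._candidate
            self._candidate = None
        return self.current

# Express lanes posted at 70 mph, with occasional glimpses of 55 mph local-lane signs.
limit = DebouncedSpeedLimit(initial_mph=70)
for t, mph in [(0, 70), (2, 55), (3, 70), (8, 70), (10, 55), (12, 55)]:
    print(t, limit.update(mph, t))  # stays at 70 unless 55 persists for 5+ seconds
```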
 
As has always been the case with Autopilot and now FSD beta, everyone's experience is different. I run FSDb all the time, heavy traffic, now on highways too, etc. I just have a much lower threshold for disengaging when in heavier traffic.

When no cars are around, I'm curious and more willing to let it struggle and see whether it can figure it out. For example, a couple of the 4-way stop sign intersections on a typical route home for me seem strangely difficult: it'll stop and not proceed, or take a very long time to crawl slowly through the intersection before speeding up to a normal/natural speed. When no one is around (including pedestrians), I let it take its time because I'm trying to understand what's causing the issue.

But at the same time I've taken many unprotected left turns through heavy traffic and it does great. It's also not consistent (the nature of AI/ML?), so I test it at the same intersection and route multiple times before deciding whether I think it's regressed.
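As a rough illustration of why multiple runs matter before calling a regression, here is a small Python sketch with made-up counts: it asks how often a drop from 7/10 to 4/10 successful turns would happen by luck alone if nothing had actually changed. The numbers and the simple binomial check are purely hypothetical; none of this reflects real Tesla data.

```python
# With only a handful of attempts per version, success rates swing a lot by chance.
# All counts below are invented for illustration.
from math import comb

def prob_at_most(successes, trials, true_rate):
    """P(X <= successes) for X ~ Binomial(trials, true_rate)."""
    return sum(comb(trials, k) * true_rate**k * (1 - true_rate)**(trials - k)
               for k in range(successes + 1))

# Hypothetical: the old build completed an unprotected left 7/10 times at one
# intersection; the new build managed only 4/10 at the same spot.
old_rate = 7 / 10
p_by_luck = prob_at_most(4, 10, old_rate)
print(f"chance of 4/10 or worse if nothing actually changed: {p_by_luck:.2f}")
# ~0.05 -- suggestive of a regression, but 10 runs is still a small sample.
```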
 

I'll try not to give you too hard a time here... but (haha), in that scenario I think you should have been a lot more cautious and reserved about how far you were going to allow the car to go.

That being said, it was a tough scenario for the car, and for anyone who REALLY WANTS to make that left turn on a yield-on-green. But the two ways of thinking about it (car vs. human) are WAY different, I think. Obviously I am making my own assumption about how the car thinks through that scenario... The car is looking at the scene and working ONLY with the data it CURRENTLY sees. The human sees that same data, but the human KNOWS that in that specific scenario, once they make the decision to go, they have to be >99% committed to that turn and make it aggressively. The car does not realize that it needs to make that turn aggressively. I think it COULD, with some better speed/distance analysis for that given scenario.
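As a sketch of the kind of speed/distance analysis being described, here is a minimal Python example that compares the time needed to clear the turn against the oncoming car's time to arrive, plus a safety margin. The distances, acceleration, margin, and function name are all assumptions for illustration; this is not how Tesla's planner actually decides.

```python
import math

# Minimal gap-acceptance sketch: can the ego car clear the turn before the
# nearest oncoming vehicle arrives, with some margin? All numbers are assumptions.

def gap_is_acceptable(oncoming_dist_m, oncoming_speed_mps,
                      turn_path_m=15.0, accel_mps2=3.0, margin_s=2.0):
    """Return True if the turn can be cleared before the oncoming car arrives."""
    # Time for the oncoming car to reach the conflict point at constant speed.
    time_to_arrival = oncoming_dist_m / max(oncoming_speed_mps, 0.1)
    # Time for the ego car to cover the turn path from a standstill (s = 1/2 a t^2).
    time_to_clear = math.sqrt(2.0 * turn_path_m / accel_mps2)
    return time_to_clear + margin_s < time_to_arrival

# Oncoming car 80 m away at 20 m/s (~45 mph): arrives in 4 s, we need ~3.2 s + margin.
print(gap_is_acceptable(80.0, 20.0))   # False -> wait (too tight with a 2 s margin)
print(gap_is_acceptable(140.0, 20.0))  # True  -> a 7 s gap is enough to commit
```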

One scenario shouldn't make you give up on using FSDb, though; just don't let it attempt that scenario again until the next update. And when you try again, please be a little more cautious about how far you let the car take things.

Every time the steering wheel jerks, be sure to disengage

Yeah, when the steering wheel jerks it tends to be a big clue that the car might not be ready to make the move. I disengage 99% of the time it starts jerking too, or at least I'm more prepared to abort the maneuver before getting into a sticky situation.
 

I tend to disagree.

Just because the wheel jerks doesn't mean that FSD should be disengaged. There are many times, at extremely slow speeds, when the wheel makes some corrections and has some large excursions but still gets the job done correctly. And in most of these cases, if you are going a little faster, the jerkiness does not occur.

"deviate from the exact path that you would take" I absolutely disagree. Watch an intersection for a period of time. How many people turning left takes the same path? Some swing wide, some swing short. Some go to the left lane, some go to the right. Some pull out into the intersection and wait for a clearing, some stay back.

There are too many acceptable paths to require the car to follow the "exact path" that you would.
 
You’ll know it when it happens.

When using FSD, steer the car as you would. If it disengages, it was too far off the path.

The important thing is to be actively steering at all times according to the correct path.

Sorry, I have to partly disagree with you here... just because you cause it to kick out of AP when it doesn't steer the path you would have doesn't mean the path it chooses is "out of spec" per se. Nor does it mean that if you let it continue it would cause an unsafe situation. As long as it follows a safe and reasonable path for the scenario, that is a good thing.

I am curious as to whether the car offsetting lane position because of a large truck would trigger your threshold. I'll have to try that one.

The other thing is that the amount of force required to kick out of AP is not static, nor do I really believe it to be predictable.