Welcome to Tesla Motors Club
I’ve found that FSD Beta (under 2022.44.30.5) cannot negotiate a particular intersection geometry shown below. The vehicle just stops before the intersection and will not proceed.

Here’s the setup: the vehicle is in blue with the direction of motion shown by a black arrow. The planned path is shown in green. The intersection has traffic lights that I believe the vehicle sees. The vehicle does not pull up to the stop line; instead it stops in the middle of the curve and will not proceed. I’ve encountered two instances of this intersection geometry in my area, and in both cases the vehicle behaves the same. I have to manually drive the vehicle through the right turn, then re-engage FSD.

[Attached image: 1673641711580.png — diagram of the intersection geometry]
I’ve also noted that the vehicle under FSD Beta doesn’t handle double-left-turn lanes at an intersection very well. It fails to anticipate which of the two left-turn lanes it should use for its near-future planned path, and it will slowly maneuver between the two left-turn lanes when there are no other vehicles around. I have to take over and reposition the vehicle in the correct lane for the future path.

Yet another problem: FSD Beta will not use a particular freeway exit even though it is in the planned route. This exit is at the intersection of I-5 and I-405 south of Seattle in the Tukwila area. The particular exit is the transition between “southbound” I-405 and southbound I-5. It’s a 270-degree transition between freeways. The vehicle, in the right lane, consistently drives right past the exit onto SR-518 (beyond the end of I-405), then slows way down and impedes traffic behind it. I have to drop out of FSD and speed up while Nav finds an alternate route. Sometimes the vehicle will change lanes into the left lane because of traffic before the interchange, then fails to change back into the right lane in time to make the exit.
 
Be careful with that. I tried that experiment with my trouble point from above and let the software figure its way out from an obvious problem it put itself into. At the very last moment at the intersection it wildly steered and disconnected at the same time, and even though I was prepared to react, I barely missed hitting the curb. I gave up experimenting after that, and as soon as I see it is not doing the right thing, I correct it myself. At intersections, I always look for oncoming traffic myself and press on the accelerator so it doesn't have to be hesitant, and I move to an appropriate lane if it is not doing it in time. Maybe I won't get a medal from Tesla's beta testing team, but with just a little effort in a few known situations, FSD Beta becomes a much better experience. Now, if only the trust were mutual, and it stopped asking me to nudge the wheel...

Absolutely! Whenever I am trying to figure out a situation like this and fully classify the failure, I am VERY careful. I also do a lot of thinking about various failure scenarios and what the worst thing that could happen is, and I plan how to abort before those can happen. If there isn't enough time to react and abort for any given safety issue, then I won't do it. In this specific case the full failure could result in going into the grass and hitting some signs, with no curbs present. The worst case (given no traffic, which would be the only way I would test this scenario fully) is a sudden weird swerve into a telephone pole, which is highly unlikely. All these issues can be adequately mitigated given reflex time versus the available speed and distance.

Testing with traffic involved is also VERY different from testing with no other vehicles involved... I don't even let it do a left or right turn (at all) when it is a double turn lane and there is another vehicle in the other lane...
 
FSDb is also often really bad at speed decisions, resulting in passenger discomfort and missed turns.

In particular:
  • It accelerates up to full speed even when there's a red light ahead with a line of cars, then brakes far too late for regen to recover all that wasted energy.

Yeah, it does seem that when there is a red light AND a vehicle stopped in front of you, the car *seems* to ignore the red light. I think the failure is in the stopping algorithm when coming up to a stopped vehicle. When it's just stopping for a red light with no cars ahead, I have seen it do so very smoothly and with plenty of time.
 
It isn't consistent. Sometimes it slows down in plenty of time when there are stopped cars, and sometimes it doesn't.

In much the same way, sometimes it slows down for the speed bumps in my neighborhood, and sometimes it tries to break an axle.
 
I think it's incredible! Does it mess up sometimes? Absolutely. But quite a bit of the time it's the most incredible tech I own.
Can't it be both incredible and not worth it at the same time? It is incredible in that I see the potential for it being good eventually, and it does a lot of stuff really well, like spotting pedestrians, garbage cans, traffic lights, stop signs, etc. Now if only it could make turns without it feeling like someone on drugs jerking the wheel back and forth to the point where I take over more than half the time. 😁
 
Has anyone else experienced situations where Full Self Driving software performed maneuvers that caused damage or injuries, with no time for the driver to intervene? It happened to me. Tesla drivers need to be aware that they assume all financial liability, even if the damage was caused by the Full Self Driving software, sensors, or cameras. You are personally risking damage, injury, and financial liability to test out their software and driving systems. That is not stated in any disclaimer by Tesla. The software and sensors are not very reliable, and to claim otherwise is false advertising that encourages false expectations. I bet far fewer people would use FSD or consider it a worthwhile feature if they knew they were financially responsible for everything the software does.
 
So, what happened?
 
Had you searched around a bit here before posting, you might have noticed that users of this forum regularly discuss the liability impacts of various SAE ADAS levels and that discussion of the risks associated with FSD and FSDb is prevalent.

It is very rare to see a posting from someone claiming to have actually had an accident while using FSD or FSDb. Perhaps you could provide some details of your experience. Posting dashcam video would be rather useful since some may express a bit of skepticism regarding your claim, especially as you are a new poster here.
 
I don't think that poster has any intention to provide real details. They posted the exact same post to three different threads. They're looking for something. Maybe it's a news reporter looking to score ... we'll never know.
 