FSD Beta Videos (and questions for FSD Beta drivers)

Tesla didn’t send a driver to Chuck’s turn. No way.

I agree 10.12.1 did very poorly for Chuck, but I think it was mostly due to bugs:

1. The “take over immediately” issue Chuck hit every time around the block is supposedly already fixed in 10.12.2.

2. The “always turn right” bug is something he experienced on previous versions, was fixed, and has obviously returned. Unknown if that is fixed in 10.12.2.

But as Elon said, there will be a few steps back in this version until they get it ironed out.
 
Yeah, if Tesla sent an engineer from California to Florida to test and gather data for an improvement, then PLEASE don't send any to ATL, since the Beta and I have enough issues without those "improvements". 🤣

EDIT: I believe I have watched all his "left turn" videos, and even the videos that evolved into the left turn videos, and I think this is the first 100% failure rate. What a terrible job 10.12.1 Beta did. Several EASY opportunities to make the left, and it was a complete bail-out into an infinite loop. I take that back, since it couldn't even infinite-loop because of that STRANGE, repeatable "Take Over Immediately" bug. o_O


Pretty sure the .2 point release was to address the FSD freakout bug. Anyone notice how the pathing line gets shorter and starts flickering seconds before the failure?

The big hype for this release for months has been improved ULT behavior, and the one tester that focuses on ULTs had an abysmal run :D Face, meet egg.

Hopefully whatever was causing the issues gets fixed before this goes to the safety-score cohort.
 
Someone commented on Chuck's video speculating that the bug is tied to the system looping back on its previous nav route and then crashing hard when it reaches the spot where FSD was engaged, which seems quite plausible and aligns with what Chuck experienced.

I wouldn't expect a .x.x release to fix something like the behavior of bailing to the right, but who knows.
 
He has had the same looping-back behavior on previous Betas without any "crash". Also, that is not the point at which he engaged Beta every time; the first time he turns Beta on, he is not even on the road the "crash" happens on.

On previous Betas it would not loop back every time like it did in this video (except for the one time it had a lead car and tried to follow it into the median). It would often make the left turn successfully.
 
Good call! I read the comment last night and re-watched all of the runs to see if it was accurate, and it seemed like it was, but I didn't re-watch where he started the first run: he first engaged while already driving towards the unprotected left. So that seems to debunk that theory.

Yeah, I've been watching Chuck's videos from day 1; we hadn't seen this bail-to-the-right behavior in a while.
 
Yea, I think the "crash" was, like others have speculated, the main problem with 10.12.1. Dirty Tesla is also having it.

[Attached screenshot: Screen Shot 2022-05-28 at 11.08.12 AM.png]
 
Of note, it seems like Tesla has finally turned off the flag that had the car treating yield signs at roundabouts as stop signs. There was an earlier vid that showed perfect yield behavior at a roundabout, and in DT’s newest vid his car handled two roundabouts perfectly as well. Now to see if there are instances where the car fails to yield to oncoming cars in the roundabout, necessitating a quick brake from the driver.
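Purely to illustrate the kind of flag being speculated about here, a minimal Python sketch; every name and value in it is invented for the example and is not from Tesla's actual stack:

Code:
# Hypothetical illustration only -- not Tesla code.
# A single behavior flag that makes yield signs at roundabouts act like stop signs.
TREAT_ROUNDABOUT_YIELD_AS_STOP = False  # the flag that now appears to be turned off

def planned_action(sign: str, at_roundabout: bool, oncoming_clear: bool) -> str:
    """Decide how to approach a detected sign."""
    if sign == "stop":
        return "stop"
    if sign == "yield":
        if at_roundabout and TREAT_ROUNDABOUT_YIELD_AS_STOP:
            return "stop"  # old, conservative behavior: full stop at every roundabout yield
        return "proceed" if oncoming_clear else "yield"
    return "proceed"

print(planned_action("yield", at_roundabout=True, oncoming_clear=True))   # -> proceed
print(planned_action("yield", at_roundabout=True, oncoming_clear=False))  # -> yield

The open question in the last sentence of the post is whether the "oncoming_clear" judgment is reliable against cars already circulating in the roundabout.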
 
I’m not convinced FSD handles this properly even now. I’ve had a couple of instances when FSD wanted to get into the roundabout after stopping even though a car was coming from the left.
 
Haha, I was kidding. If it's busy enough, certainly a right would be advisable and often faster, but in a Tesla, there were plenty of opportunities. 0-60 in 3.1 seconds, man. If you're not using it, you're not living. But in this case it was possible even in a Corolla.

Like this opportunity...:

The other fix you propose (and support for u-turns) are just Band-Aids for a system with incomplete function (at the current time).
LiDAR is the bandaid. Not this.

UPS also does this "bandaid", or so I've read.

A feature where the car takes a safer option after evaluating risk is the correct thing to do. As I keep noting, @diplomat33 ’s favorite company would do this as well.

As a wise person once noted, Band-Aids are actually very useful and have been used for decades ;)
 

I have no problem with AVs rerouting to avoid risky or challenging routes, especially when they have customers onboard. It is much better to play it safe than to risk injury or worse to the passengers if an accident were to occur. So, if Tesla launches a driverless robotaxi service, like Waymo has, and they decide to reroute in some cases to minimize risk to the passengers, I would be completely fine with that.

But is the AV rerouting to minimize risk to the passengers, or is the AV rerouting because it is not capable of handling the route reliably? That's a big difference. In the case of Waymo, it has very capable FSD that can handle unprotected left turns; it just reroutes in some instances when carrying passengers. And my guess is the 5th Gen I-Pace probably reroutes less than the 4th Gen Pacificas in Chandler.

In the case of Tesla, I am not convinced that the current hardware/software is good enough to handle unprotected left turns reliably. I suspect part of the reason that FSD Beta reroutes in Chuck's videos is because the perception, prediction and planning stacks lack confidence due to not being able to reliably track fast moving traffic at long range from the sides. If Tesla's FSD is avoiding unprotected left turns because the hardware/software are not good enough yet, that's very different from Waymo.
 
A feature where the car takes a safer option after evaluating risk is the correct thing to do

My point was that is not what it is doing, pretty clearly. Obviously, at a minimum, the risk assessment is wildly wrong (it may not even be making a risk assessment and may be deciding that it is not possible to turn left - would have to have one successful left turn with no traffic to disprove that). Compounded by inability to reroute.

I suspect part of the reason that FSD Beta reroutes in Chuck's videos is because the perception, prediction and planning stacks lack confidence due to not being able to reliably track fast moving traffic at long range from the sides.

Yes, but not fundamental limitations on perception, prediction, and planning - yet. Some of those things may at some point become a limitation which makes appropriate safety levels impossible, but in this case I think it’s happening because the software is not appropriately coded. It’s a bug!

Some people here are pushing to have the Tesla make right turns to avoid this bug. I, on the other hand, am pushing to have Tesla fix this limitation (bug), and make the performance on UPLs as good as theoretically possible given the sensor suite and HW3 compute.

I’m also strongly opposed to routing decisions which heavily prefer avoiding UPLs and routing to intersections with traffic lights, as has been suggested; that preference would put Tesla at a tremendous disadvantage and lead to nonsensical routing (see the rough sketch below). I want it to do this only when it’s sensible and what a human would do.

Sounds like currently there is no point in updating the software in any case. I will wait.
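The sketch mentioned above: a toy route scorer, with every number and name invented for illustration (this is not how Tesla's planner works, just the trade-off being argued about). A modest per-UPL penalty keeps sensible routes; a huge one forces detours to traffic lights even when the UPL is the obvious human choice.

Code:
# Toy route-scoring sketch -- hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class Route:
    name: str
    drive_time_s: float     # baseline travel time in seconds
    unprotected_lefts: int  # number of UPLs along the route

def route_cost(route: Route, upl_penalty_s: float) -> float:
    """Lower is better: travel time plus a fixed time penalty per unprotected left."""
    return route.drive_time_s + upl_penalty_s * route.unprotected_lefts

direct = Route("direct route via the UPL", drive_time_s=300, unprotected_lefts=1)
detour = Route("detour to a traffic light", drive_time_s=420, unprotected_lefts=0)

for penalty in (60, 600):  # mild vs. extreme aversion to UPLs
    best = min((direct, detour), key=lambda r: route_cost(r, penalty))
    print(f"UPL penalty {penalty}s -> {best.name}")
# UPL penalty 60s  -> direct route via the UPL
# UPL penalty 600s -> detour to a traffic light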
 
It seems like it’s correctly coded. It just isn’t confident enough in its perception, so it takes the safer choice (rough sketch of the idea below).

confidence_of_perception < confidence_level_required at this stage of the beta

The confidence_level_required can be manually decreased as Tesla gets more data, and the perception can be improved with more data.

Tesla has added a lot of complexity to its code with this update, so some new bugs are to be expected. Many of these will now be found by the testers and fixed in upcoming releases. Tesla takes an iterative, trial-and-error approach.
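A minimal sketch of that confidence gate, with the threshold and names invented for the example (nothing here is from Tesla's code):

Code:
# Hypothetical confidence gate for an unprotected left -- illustration only.
CONFIDENCE_REQUIRED = 0.9  # kept high during the beta; could be lowered as data improves

def choose_maneuver(perception_confidence: float) -> str:
    """Attempt the unprotected left only if perception confidence clears the bar."""
    if perception_confidence < CONFIDENCE_REQUIRED:
        return "bail to the safer option (turn right / reroute)"
    return "proceed with the unprotected left"

print(choose_maneuver(0.75))  # -> bail to the safer option (turn right / reroute)
print(choose_maneuver(0.95))  # -> proceed with the unprotected left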
 
My point was that is not what it is doing, pretty clearly. Obviously, at a minimum, the risk assessment is wildly wrong (it may not even be making a risk assessment and may be deciding that it is not possible to turn left - would have to have one successful left turn with no traffic to disprove that). Compounded by inability to reroute.
Well, risk assessment needs to be done by FSD according to its perception and planning abilities, not according to ours.

Let me put it this way: I prefer bailing to trying the ULT only to get stuck and force a safety intervention, like it has done so many times in the past at that location.
 
I have no problem with AVs rerouting to avoid risky or challenging routes, especially when they have customers onboard. It is much better to play it safe than to risk injury or worse to the passengers if an accident were to occur......
That is fine for the nicer burbs and smaller cities... but the problem is living in the middle of a major urban city. It is challenging and risky, with dynamic patterns (construction, detours, traffic) in every route and direction.
 

True, and that is why AV companies are so focused on solving dense urban driving: they know they have to be able to handle it safely if they want to deploy robotaxis in urban areas. Rerouting is OK in some cases, maybe necessary sometimes, but you will never have a successful robotaxi service if you always avoid all ULTs, for example. So ultimately, your AV does need to be able to safely handle as many risky and challenging cases as possible. All I am saying is that some rerouting is sometimes OK. After all, even human drivers sometimes reroute to avoid a dangerous situation. There are cases that even good human drivers avoid for safety reasons. They are rare, but they do exist.
 
When the planner moves to NN?

People point out various cases where the planner is not obviously optimal. If Tesla were the "hacking" kind, they could fix those specific cases - but that would be foolish. They need to keep working on the generic solution that optimizes and finally "fixes" all these cases.
Why would the planner NN fix it? Because it works for humans? Does that follow?