Welcome to Tesla Motors Club

FSD Beta Videos (and questions for FSD Beta drivers)

Good question. I don't remember any tester commenting on that but perhaps someone else will. With regard to another type of sign it appears there has been a recent change where FSD beta knows from signs whether it can go right on red or whether a sign restricts that. I believe it was Dirty Tesla that mentioned that in a recent video. If that's true it would bode well for reading school zone time restriction signs.

Dirty Tesla speculated about right on red, but it's possible that data came from the map (or both). My guesses are as good or bad as anyone else's, but I'm assuming Tesla gets data from the map as well as the cameras. From my somewhat outdated knowledge of AI, the map data is possibly used as hints to the AI (an "expect a right-on-red restriction here" type of thing). And for clarity, I'm not talking about the (much confused) idea of an HD map here, just the basic info available from the nav maps.
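Roughly what I mean by "hints", in toy form: the nav map supplies a prior and the camera supplies evidence, and the two get combined. Everything here is made up for illustration (the function, the numbers, the whole scheme) - nobody outside Tesla knows whether it works anything like this.

```python
# Hypothetical sketch: fuse a map prior with a vision likelihood ratio
# via a simple Bayesian odds update. Illustration only.

def fuse(map_prior, detector_likelihood_ratio):
    """Posterior P(restricted) given a map prior P(restricted) and a
    vision likelihood ratio P(obs | restricted) / P(obs | unrestricted)."""
    odds = map_prior / (1.0 - map_prior)
    posterior_odds = odds * detector_likelihood_ratio
    return posterior_odds / (1.0 + posterior_odds)

# Map hints "expect a right-on-red restriction here" (prior 0.7), but the
# camera sees no sign (likelihood ratio 0.2, i.e. evidence against):
p = fuse(0.7, 0.2)
print(round(p, 3))  # 0.318 - the map hint is mostly overruled by vision
```

The point of the hint is just to bias the system: with no map prior (0.5) the same camera evidence would push the estimate even lower.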
 
(0:16 to 0:21)
Yeah, I'm sorry I just don't understand this behaviour. The car is supposed to be turning hard right.
Watch the path prediction. It's going left, straight, right, left, right...
Also watch the wheel. It's actually turning left, right, left, right WHILE it's making the hard right turn.
How can it not know whether it's supposed to turn left or hard right WHILE turning? It keeps changing its mind.
So forget what the steering wheel did. What grade would you give FSD for this sharp right turn?
 
(0:16 to 0:21)
Yeah, I'm sorry I just don't understand this behaviour. The car is supposed to be turning hard right.
Watch the path prediction. It's going left, straight, right, left, right...
Also watch the wheel. It's actually turning left, right, left, right WHILE it's making the hard right turn.
How can it not know whether it's supposed to turn left or hard right WHILE turning? It keeps changing its mind.

It knows from the map data it needs to take a hairpin right, but the vision system isn't able to confirm if that path is available yet. It may be planning contingency routes in case it is eventually unable to go right at the tee. It can see that it can go left.
 


It knows from the map data it needs to take a hairpin right, but the vision system isn't able to confirm if that path is available yet. It may be planning contingency routes in case it is eventually unable to go right at the tee. It can see that it can go left.
This is a good explanation and the way that I've always ASSUMED it was working. It's basically a "trust but verify" system.
 
I speculate Whole Mars got into an FSD beta accident, so he's not been posting for a while :confused:
He is actually being sued by a deranged TSLAQ a*hole (for legal purposes: these are terms of endearment used in my country of birth :rolleyes: )

Specifically, he is being sued by a very special kind of a*hole - the one that created a fake charity to collect money from journalists and the like to smear Tesla.
If you actually want to know more info on the $h!t that Omar is going through with this special "human being" that is suing him, read further here: How Aaron Greenspan’s Charity PlainSite Silences Critics by Cyberstalking
 
So forget what the steering wheel did. What grade would you give FSD for this sharp right turn?

So you're a driving test examiner. You tell me to turn right at the stop sign.
I look left and right and evaluate my contingency routes. Fine, we're all gonna do that subconsciously. Somewhat. That's part of drawing the lines of the road.

But what FSD is doing is plotting a path and actually turning the steering wheel as it flits between dozens of path choices.

If I turn the wheel to the left while planning to go left, then turn right while planning to go right, several times in the same second, well, that's an awkward way for a person to be driving. That's undue hesitation, and yes, you would fail.

eg.: "Undue Hesitation: Due to the candidate not applying enough observation and planning in advance, they’ll arrive at the junction and decide what to do once they are there"
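To show what I mean about flitting: a planner that greedily follows the best-scoring path every frame will oscillate with noise, while one that only abandons its current path when a rival wins by a clear margin will not. This is a toy illustration of the standard hysteresis fix, not a claim about how Tesla's planner actually works.

```python
# Toy demo: greedy per-frame path selection flits with noisy scores;
# adding a commitment margin (hysteresis) damps the oscillation.
# Purely hypothetical - for illustration only.

def choose(scores, current, margin=0.0):
    """Pick the best-scoring path, but only abandon the current one
    if a rival beats it by at least `margin`."""
    best = max(scores, key=scores.get)
    if current is not None and scores[best] < scores[current] + margin:
        return current
    return best

# Noisy per-frame scores for "left" vs "right":
frames = [{"left": 0.52, "right": 0.48},
          {"left": 0.49, "right": 0.51},
          {"left": 0.51, "right": 0.49},
          {"left": 0.40, "right": 0.60}]

greedy, damped = [], []
cur_g = cur_d = None
for f in frames:
    cur_g = choose(f, cur_g, margin=0.0)   # follows every wobble
    cur_d = choose(f, cur_d, margin=0.1)   # holds until evidence is decisive
    greedy.append(cur_g)
    damped.append(cur_d)

print(greedy)  # ['left', 'right', 'left', 'right']
print(damped)  # ['left', 'left', 'left', 'right']
```

If the wheel tracks the greedy chooser, you get exactly the left-right-left-right behaviour in the video; the damped version waits until the right turn is clearly available, then commits.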
 
#FSDBeta 7 -2020.48.10.1 - Turn Near Large Trucks and a Turn Signal Issue in this build

Thanks for posting that Chuck.

Another example of risky instant course correction. The path flits to the right into the path of the truck beside him, the steering wheel follows, and Chuck has to disengage. It happens too quickly to be hands-off.
Correction.jpg
 
So you're a driving test examiner. You tell me to turn right at the stop sign.
I look left and right and evaluate my contingency routes. Fine, we're all gonna do that subconsciously. Somewhat. That's part of drawing the lines of the road.

But what FSD is doing is plotting a path and actually turning the steering wheel as it flits between dozens of path choices.

If I turn the wheel to the left while planning to go left, then turn right while planning to go right, several times in the same second, well, that's an awkward way for a person to be driving. That's undue hesitation, and yes, you would fail.

eg.: "Undue Hesitation: Due to the candidate not applying enough observation and planning in advance, they’ll arrive at the junction and decide what to do once they are there"
I think you missed the point of the exercise. Just look out the window and grade how well the car took the turn. The driver certainly seemed very pleased and didn't think the car failed. I'll give it a B, certainly not an F.
 
I think you missed the point of the exercise. Just look out the window and grade how well the car took the turn. The driver certainly seemed very pleased and didn't think the car failed. I'll give it a B, certainly not an F.
I didn't grade the turn, just the behaviour. If it makes the turn without disengaging or hitting anything, then fine, it's a B.

#FSDBeta 7 -2020.48.10.1 - Turn Near Large Trucks and a Turn Signal Issue in this build
The turn in Chuck's video is an F
 
So you're a driving test examiner. You tell me to turn right at the stop sign.
I look left and right and evaluate my contingency routes. Fine, we're all gonna do that subconsciously. Somewhat. That's part of drawing the lines of the road.

But what FSD is doing is plotting a path and actually turning the steering wheel as it flits between dozens of path choices.

If I turn the wheel to the left while planning to go left, then turn right while planning to go right, several times in the same second, well, that's an awkward way for a person to be driving. That's undue hesitation, and yes, you would fail.

eg.: "Undue Hesitation: Due to the candidate not applying enough observation and planning in advance, they’ll arrive at the junction and decide what to do once they are there"

Tough crowd. Your examiner might think it was great that you were being prudent before committing to a blind turn at the peak of your path. There could be a washout just around the corner past the peak. If you had your wheels hard right and are now pointing at and close to the washout, what can you do? FSD can't back up yet. By turning the wheels hard left, you are still going to advance toward the washout for a while, so you might be trapped there. There could be other obstructions as well, or there could be no road there at all.

The car doesn't repeatedly turn the steering wheel to the left and then the right. It straightens the wheel once a little (less of a right turn) as it starts to advance until it has more confidence in the route to the right. Then it commits.

The paths change while he waits his turn as the car to the right approaches and travels through the intersection.
 
Tough crowd. Your examiner might think it was great that you were being prudent before committing to a blind turn at the peak of your path. There could be a washout just around the corner past the peak. If you had your wheels hard right and are now pointing at and close to the washout, what can you do? FSD can't back up yet. By turning the wheels hard left, you are still going to advance toward the washout for a while, so you might be trapped there. There could be other obstructions as well, or there could be no road there at all.

The car doesn't repeatedly turn the steering wheel to the left and then the right. It straightens the wheel once a little (less of a right turn) as it starts to advance until it has more confidence in the route to the right. Then it commits.

The paths change while he waits his turn as the car to the right approaches and travels through the intersection.

Your hypotheticals are not relevant as that's not what happened in this case.

The path mechanism in the last few videos I have commented on shows that it's making snap decisions that vary several times per second. Worse, it's acting on these changes. The only necessary path was right; there was no need to take a left path. If that was part of the decision process, fine, but the wheel is following the path choices.

It's clear that we are drawing different conclusions from the same video, and that's fine by me. We're all entitled to our opinions; only a programmer/analyst from Tesla can give the correct answer.
 
It's clear that we are drawing different conclusions from the same video
This seems to happen frequently here. You seem entrenched and determined to find fault with FSD even at the most trivial level (the steering wheel moves too much). I'm certain I've already exceeded GlmnAlyAirCar's limit on commentary, so, OK.