I'm not going to agree or disagree with your assessment since I didn't watch the video. I do think it should be apparent to everyone that FSD will not drive like you or I would in some situations, so we have to be careful not to assume it will. I've noticed Waymo also drives differently than I would. Not bad, just different, and FSD will be the same once it reaches parity (hopefully!). Just waiting for the release of the next beta version before watching any new videos. Come on Tesla!!
4:18 - car looks like it’s about to run a red light, forcing Chuck to take over and swerve into the empty right turn. 12:20 - car looks like it’s going to happily run into the road closed sign ahead.
It looks like FSD beta intentionally ran the red light believing it had already crossed the stop line and entered the intersection, so it was continuing through as if it had entered on green, which was actually the green for the right lane but incorrectly associated with this lane. Notice how the predicted-path blue line turns gray too early, at a line that's actually for the school sign; after FSD beta realized it was going too fast to stop for this "stop line," it decided to get out of the intersection, not realizing it was still before the actual stop line.
Here is a classic example that needs work, BUT can FSD ever react with the empathy and understanding a human can? As we have seen, FSD needs to start slowing sooner for pedestrians to instill confidence. Barreling toward them and then hard braking may not hit them, but it is confusing to pedestrians. Probably an easy fix. BUT back to this specific situation that is simple to humans. There is a HUGE difference between what we see and what FSD sees. The elderly man is ahead of his wife, committed to crossing, and concerned about the speed of the car. The woman wants to cross and is CLEARLY giving the "are you going to stop and let me cross?" look that we humans understand. However, the car only sees a green box that it should not hit, with no concern for how close it is, and a yellow box that it has no relationship with and NO empathy for. In all likelihood FSD would have slowed just enough, "bullying" the man to "hurriedly" clear the curb by a few feet, and then would have continued forward, "cutting" the woman off. To the people it would have been an a$$hole move, and Chuck (you and I) would have felt guilty, but FSD would not "care".
We almost got some grass-cutting action here! This Park St connection is a bit odd, as reflected in the navigation thinking the upcoming action is a left turn 1.4 miles away -- i.e., navigation believes you just need to stay on the current street without needing a right turn. At least on OSM, there's an unnamed "tertiary_link" that connects these two segments of Park St, and the intersection of the named Park St segments is where the no-right-turn sign is.
@Chazman92 in today's video, you said the car got confused yesterday wanting to stay straight on Herschel instead of turning right on to St Johns. As shown in the screenshot, navigation knew it wanted to stay on Herschel, so it didn't attempt to get into the right lane, as it knew it didn't want to be in a right-turn-only lane. But it did get confused by either one or both of the green right/up arrow and the line drawn for the school sign, so you did end up correctly recovering from the car wanting to drive straight through a red light. I believe in previous videos navigation did want to turn on St Johns, so I'm not sure why this time it wanted to navigate straight.
One of the segments of Park Street ( Way: Park Street (682970835) | OpenStreetMap ) was tagged one-way in the wrong direction. Not sure if that was related to the odd navigation choice, but I just set the direction correctly.
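For anyone curious what that kind of fix looks like in the OSM data model: a way's direction is defined by the order of its nodes, and the `oneway` tag is interpreted relative to that order. A hypothetical fragment (the IDs, nodes, and tags here are illustrative, not the actual Park Street data):

```xml
<!-- Hypothetical OSM way fragment; values are illustrative only. -->
<way id="682970835">
  <nd ref="1001"/>  <!-- node order defines the way's direction -->
  <nd ref="1002"/>
  <nd ref="1003"/>
  <tag k="name" v="Park Street"/>
  <tag k="highway" v="tertiary"/>
  <!-- oneway=yes means traffic flows in node order (1001 -> 1003).
       If that's backwards, the usual fix is to reverse the way's node
       order in the editor; oneway=-1 (against node order) also exists
       but is discouraged. -->
  <tag k="oneway" v="yes"/>
</way>
```

Routers consume this directly, so a reversed `oneway` can absolutely produce odd navigation choices like the one above.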
I wonder if Tesla should sell FSD (when it is ready) to some cities for public transport purposes first; that way the cities would have skin in the game to evolve the legislation.
Seems like Autopilot currently relies on detecting both a stop sign and a stop line to decide it should stop, and it is able to infer implied stop lines at intersections without one drawn. I guess there just needs to be more training data for snowy 3-way angled intersections like this one. Although in this particular case, one could try to infer a stop line from the road wear left by other vehicles previously stopping there.
Not sure if FSD beta was being overly cautious or if this is how people should react when going through this curved intersection without guiding stripes: here the car slowed down to 24mph instead of up to 45mph because it wasn't sure if the adjacent vehicle would stay in its lane.
Personally, I think that's defensive driving behavior because the adjacent vehicles also slowed down. FSD beta kept a consistent distance behind the Volvo adjacent to it.
Or the glass-half-full thinking is that the next beta release will show significant improvement. Let's hope that's the case.
Wow. A right turn from a left turn lane into oncoming traffic's left turn lane! I suppose maybe it knew it was turning from the 2nd-from-right lane, so it should end up in the 2nd-from-right destination lane?