Welcome to Tesla Motors Club

FSD Beta Videos (and questions for FSD Beta drivers)

It probably includes both, but does it really matter?
It matters, in the sense that the number of external beta testers gives us an idea of how confident Tesla is and how far they are from GA. Tesla has never had, and will never have, any problem getting data. It's what to do with the data, and how fast it can be incorporated to improve FSD, that's been the issue.
 
Wow, only 1 disengagement in 22 miles because of an edge case. Also, it was impressive that it recognized the gate.

That's not an edge case. That's a scenario you see dozens of times in a single urban drive.
After the so-called "15 billion miles of data advantage," this is the result? Failing at one of the simplest scenarios? Pathetic!
 
Interesting situation where FSD beta initially wants to go around the biker, then decides to follow:

around.jpg

follow.jpg

Unclear if it's traditional code deciding not to select a path around the biker because an intersection has been detected, or if the neural network is actively deciding that the biker should be followed given everything else it sees.
 
FSD Beta 8.1 - 2020.48.35.7 - Downtown Chicago
Some things I found:

1:40 Disengagement, failure to get into correct lane
2:05 Intervention to get through yellow light
2:25 Failure to use turn signal while waiting to turn
3:10 Disengagement, too close to the curb
3:45 Weaving
4:15 Allowing another car to merge in (good)
6:15 Disengagement, missed the left turn entirely
6:25 Path prediction (while disengaged) still desperately wanted to go left with no possible lane
6:45 Large phantom object on right
6:57 Large phantom object ahead
7:40 Disengagement, turning when prohibited. Creeping directly into traffic
8:15 Intervention, manual reversing to clear crosswalk
9:15 Disengagement, failure to get into correct lane
9:25-10:05 Poor path prediction in left-turn-only lane, swinging from left (correct) to straight to right (no!)
10:12 Intervention to accelerate
11:00 Intervention, failure to obey hand signals of flagger
11:45 Phantom object directly ahead

Total distance 1.3 miles
Disengagements: 5
Interventions: 4
 
9:25-10:05 Poor path prediction in left-turn-only lane, swinging from left (correct) to straight to right (no!)

I wonder if the "path prediction" is simply the direction the steering wheel is currently trying to point.
I notice that the steering wheel jerks instantly to match the direction of the drawn path, so if the car needs to nudge right, it draws the path right. Perhaps that doesn't mean it actually intends to turn right?
Watch from 9:53 to 10:10, especially around 10:03-10:08
https://www.youtube.com/watch?t=564&v=4_dxHWopaFo

Could it be that the path prediction is indicating instantaneous wheel direction, while the planner still intends to turn correctly? Otherwise I don't see how it could be drawing so many alternate paths while trying to turn left.
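If the drawn path really were just the instantaneous steering direction held constant, it would behave exactly like this: a momentary nudge of the wheel to the right bends the whole drawn arc right, even while the planner still intends a left turn. A minimal sketch with a kinematic bicycle model — purely hypothetical, not Tesla's actual renderer, and the wheelbase/speed values are made-up assumptions:

```python
import math

def predicted_path(steering_deg, wheelbase=2.9, speed=5.0,
                   horizon=3.0, dt=0.1):
    """Project the arc the car would trace if the current front-wheel
    angle were held constant (kinematic bicycle model).

    steering_deg: front-wheel angle in degrees, positive = left.
    Returns a list of (x, y) points in car-local coordinates
    (x forward, y left), sampled every dt seconds up to `horizon`.
    """
    x = y = heading = 0.0
    steer = math.radians(steering_deg)
    points = []
    t = 0.0
    while t < horizon:
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
        heading += speed / wheelbase * math.tan(steer) * dt
        points.append((x, y))
        t += dt
    return points

# A brief nudge right (negative angle) immediately bends the drawn
# path right, even if the route still calls for a left turn.
right_nudge = predicted_path(-5.0)
left_turn = predicted_path(20.0)
```

Under that hypothesis, the on-screen path swinging right for a moment would only mean the wheel was momentarily pointed right, not that the car had chosen a right turn.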
 
Could it be that the path prediction is indicating instantaneous wheel direction, and yet somehow also saying it still intends to turn correctly?
Indeed quite odd that a path would even go to the right for a left turn. The path can briefly swing the opposite direction at first yet still resolve into a correct path toward the desired direction overall, e.g., to make a wider turn, so something else must have been confused in this case:

wabash 3 right.jpg


My guess is the neural network predicts how far it has traversed along the navigation route, and for a moment it believed it hadn't yet made the right turn, which is the current navigation step shown on the right. Most of the time, though, it was jumping between left and straight, probably because snow confused the road-edge detection, making it believe there was no left-turn destination; it defaulted to going straight in those cases.
 
Not to be rude, but who cares how the revisions are numbered? I'm sure the change makes sense to the company.

Dan

Not to be rude, but apparently making inferences is hard.

I was referring to the fact that we are now getting releases in decimal points. By opening the door with 8.1, the implication is that each "revision" can have up to 9 "subrevisions". That means we thought we were 2 updates from a wider release, but we may actually be 18 updates away. The post had nothing to do with the revision number being rolled back to 8.

Of course, that's if the next update isn't something asinine like 8.1.1.
 
Any suggestions as to why the neural network predicted this center-turn-lane vehicle to be the lead vehicle (green)?

center turn.jpg

That other vehicle seems to have been fully in the center turn lane for at least 5 seconds, yet FSD beta decided to slow down to 15 mph while traffic behind was probably expecting to go 35+ mph. I suppose there is potential for that vehicle to abort its left turn and suddenly swerve back into the travel lane, and maybe the neural network doesn't keep enough history to remember that the vehicle had just exited the lane in the first place?
 
Let me know what you think of “Brandon’s layout”. The screen camera needs more work.
A larger area for the screen visualization would make it easier to tell what FSD beta was thinking for the oncoming vehicle when trying to pass:
pass double.jpg

This screenshot was captured at 1440p quality, and the visualization occupies closer to 25% of the total video height, so it's hard to even make out the box around the oncoming vehicle.

yield mail.jpg

Whereas this 1080p video devotes closer to 50% of the height to the visualization, and the orange box is clearly visible, showing that FSD beta knew it needed to yield the right of way.