FSD Beta Videos (and questions for FSD Beta drivers)

I wonder how much instruction or two-way dialog Tesla is giving these early FSD beta testers. Brandon seems to be reporting a lot of issues that I would consider "comfort," e.g., speed changes that are too fast or slow, lane offset that is too much or too little, distance from other vehicles that is too close or too far. His tone and the noises he makes suggest he's pretty annoyed, and I totally agree these things will need to be addressed, but this is early beta software.

He does snapshot a bunch of actual safety or correctness issues as well, and I would think Tesla is most interested in those cases. But the sheer volume of snapshotting he does might just be creating extra work for the Autopilot engineers who have to watch and triage it all. I would guess the team has an internal prioritization for getting things ready for a wider release, so many of these issue types, comfort included, may already be slated to be addressed later.
In the first five minutes there are four bug reports, none of them comfort issues. I think he's reporting real issues that should be fixed before release. A lot of the "comfort" issues are really "do I feel comfortable driving like an idiot and annoying other drivers"; those should also be fixed before wider release.
0:59 random turn signal - not a comfort issue
2:15 slammed on brakes for no reason - not a comfort issue
4:06 hard turn into adjacent lane without signaling (there may have also been a car there) - not a comfort issue
5:00 swerve to left in middle of intersection no signal - not a comfort issue
 
Wish Brandon M would chop his videos up into bite-size pieces. Hour-long videos of driving are just too long. About 15 minutes max seems like it should be the general rule.
I want a super cut of just "Jeeeza!" "Oh my gawd!" and "Gawsh!"
I like his videos because he's actually trying to use it for normal driving in a typical Californian city.
His new camera angle is horrible. It would be nice to have a feed of all the car's cameras (I wonder if it allows this while using FSD?).
 
I feel like the first couple days we wanted unedited full videos rather than short fan boy sort of clips. I like Brandon’s videos as he is transparent about what’s going on and has constructive criticism. Given the hour long video this is great source material for all of us on the sidelines to hyper analyze.
 
I think he's reporting real issues that should be fixed before release.
Yeah, I did say he was reporting actual safety and correctness issues too. I don't have the timestamps as it's a really long video, but what got me to post originally was his own comment wondering whether the engineers would know what to look for, because he waited a while before snapshotting after the car merged correctly but, in his view, too close to the other vehicle. That specific merging-closeness issue, and others like accelerating too slowly after someone pulls out of your lane or centering too aggressively in the lane, are existing issues with highway Autopilot, and clearly Tesla has so far prioritized them below safety/correctness issues.
 
A lot of the "comfort" issues are really "do I feel comfortable driving like an idiot and annoying other drivers"; those should also be fixed before wider release.
One specific issue that Brandon got pretty annoyed about was a right turn on red where the car waited too long, raising concerns about annoying the other drivers also waiting to turn right. In this case, the cross traffic was making a left turn, so human drivers would generally just take the right turn if it's clear nobody is making a U-turn. FSD beta first showed a message that it was waiting for its turn, then another saying it was creeping up for better visibility because the left-turning vehicles were blocking its view of the cars that still had a red light to go straight across. He ended up taking over as the left-turn phase ended and cars were starting to go straight, but by then Autopilot had crept into the crosswalk.

This might be some specific software 1.0 driving policy that just hasn't been written yet: the current behavior seems to only look for stopped traffic on the left, whereas a smarter behavior would recognize that the left-turning traffic will block or stop the cross traffic. Ideally the driving policy will eventually do something smarter, but does it need to happen before a wider release for this situation? The human driver also has the option of just pressing the accelerator to force the right turn on red, but being a beta tester, I believe Brandon was trying to let the software do its thing.
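Purely to illustrate what I mean by a software 1.0 policy, here is a toy sketch of the kind of hand-written rule that could cover this case. Every name and structure below is hypothetical and has nothing to do with Tesla's actual stack; it's just the shape of the logic I'm imagining.

```python
from dataclasses import dataclass

@dataclass
class CrossVehicle:
    approaching: bool           # heading toward our turn path
    held_by_left_turners: bool  # currently stuck behind the left-turning stream
    predicted_u_turn: bool      # might swing into our target lane

def can_turn_right_on_red(cross_traffic: list[CrossVehicle]) -> bool:
    """Toy right-turn-on-red rule (hypothetical, not Tesla's actual policy).

    Naive version: wait whenever any cross vehicle is approaching.
    Smarter version: treat vehicles held up by the protected left-turn
    phase as non-conflicting, unless one of them might be U-turning
    into our target lane.
    """
    for v in cross_traffic:
        if v.predicted_u_turn:
            return False
        if v.approaching and not v.held_by_left_turners:
            return False
    return True

# Brandon's scenario: the straight-through cross traffic is stopped behind
# the left turners and nobody looks like a U-turn, so a human would just go.
scenario = [CrossVehicle(approaching=True, held_by_left_turners=True,
                         predicted_u_turn=False)]
print(can_turn_right_on_red(scenario))  # True
```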

Looking back at the Navigate on Autopilot deployment, things were definitely cautious, and people complained and wished it would make lane changes faster, but that initial wide deployment of NoA was trying to be extra safe with unnecessarily long buffers. I wouldn't be surprised if Tesla deployed FSD wider even with annoying-but-safe behaviors, so that they can actually measure what a more reasonable, smaller buffer would be using fleet data.
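To make the fleet-data idea concrete, here is a rough sketch of how a tighter buffer might be backed out of logged merges. The field names, the use of disengagements as a "driver was unhappy" proxy, and the percentile choice are all my own assumptions, not anything Tesla has described.

```python
def pick_merge_buffer(logged_gaps_m, disengaged_flags, percentile=0.10):
    """Estimate a tighter but still-tolerated merge gap from fleet logs.

    logged_gaps_m: closest gap (meters) recorded during each logged merge.
    disengaged_flags: True where the driver intervened on that merge.
    Keep only merges the driver let complete, then take a low percentile
    of those gaps as the new minimum buffer.
    """
    accepted = sorted(g for g, d in zip(logged_gaps_m, disengaged_flags) if not d)
    if not accepted:
        return None
    return accepted[int(len(accepted) * percentile)]

# Toy data: the one merge with a 6 m gap was disengaged, so it doesn't count.
gaps = [8.0, 12.5, 6.0, 15.0, 9.5, 7.0, 11.0]
disengaged = [False, False, True, False, False, False, False]
print(pick_merge_buffer(gaps, disengaged))  # 7.0 in this toy sample
```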
 
One specific issue that Brandon got pretty annoyed about was a right turn on red where the car waited too long, raising concerns about annoying the other drivers also waiting to turn right.
Wasn't that because someone was waiting behind him? Not making a right turn on red when safe is super annoying to other drivers.
Tesla may do wider release but I don't understand the point of doing so. Of course I'm of the opinion that they should only do things that get them to better than human performance more quickly and in a safe way (as people are fond of saying around here, lives are on the line!).
Right now they have a system that looks completely useless as a driver-assist system and pretty bad as a prototype self-driving car. I guess some people think the current FSD beta provides value as a driver-assist system? I just think Tesla should focus on true FSD, and I don't see how a wider release gets them there faster.
 
As for the concern about overloading the dev team with bug reports/complaints/incidents, you can be sure they have developed queries and sorting algorithms over the years to deal with the data. I'm sure it's a lot of boolean-logic filters like you'd build in Excel (IF conditions, less-than-or-equal-to thresholds, AND/OR).

They're most likely filtered, sorted, ranked, and grouped together before a human sits there and looks through the specific details of the cameras and telemetry.
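In that spirit, the kind of filter-then-rank pass I'd imagine looks something like the sketch below. Every field name, category, and threshold is made up for illustration and says nothing about Tesla's real tooling.

```python
# Toy triage pass over beta snapshot reports. Fields and weights are invented.
reports = [
    {"id": 1, "category": "phantom_braking",        "severity": 3, "duplicates": 40},
    {"id": 2, "category": "lane_offset",            "severity": 1, "duplicates": 12},
    {"id": 3, "category": "unsignaled_lane_change", "severity": 3, "duplicates": 25},
    {"id": 4, "category": "slow_acceleration",      "severity": 1, "duplicates": 5},
]

SAFETY_CATEGORIES = {"phantom_braking", "unsignaled_lane_change"}

# Boolean-style filter first (safety/correctness only), then rank by how
# often the same kind of issue shows up across the fleet.
triage_queue = sorted(
    (r for r in reports
     if r["category"] in SAFETY_CATEGORIES and r["severity"] >= 2),
    key=lambda r: r["duplicates"],
    reverse=True,
)

for r in triage_queue:
    print(r["id"], r["category"], r["duplicates"])
```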