Welcome to Tesla Motors Club

FSD Beta Videos (and questions for FSD Beta drivers)

Beta 10 Software Update | Initial Drive and Impressions | 2020.48.35.1 - 17:02 - Dirty Tesla
Looks like some phantom swerving for what the neural network thought was a small animal.

phantom swerve.jpg

I wonder if Autopilot internally identifies the type of animal, e.g., dog, cat, squirrel, bird. From the placement of the blue box in the visualization, it seems like near the corner of the sidewalk is the storm drain and/or pile of snow that confused it. I suppose it wouldn't be too unusual for snow to be shaped like an object of interest.
 
Could a FSD beta tester (e.g., @DirtyT3sla ) clarify with the Autopilot / early access team email whether reports of California Stops are useful? In the internal developer Autopilot settings view, there's a "Clear To Go" toggle for "California Stop" which allows rolling through stop signs at 5 mph. In fact, those on the public release builds can tap to proceed through stop signs at 5 mph as well (whereas if the vehicle's speed is higher, the tap to proceed is ignored).

clear to go.png


Other testers have pointed out that only a limited number of video report button presses are stored before older ones get overwritten, due to limited vehicle disk storage. The Autopilot team probably treats this as a "feature" and not a "bug," or at least it's known behavior, so skipping these reports could be a better use of limited resources.
 
  • Helpful
Reactions: scottf200
FSD Beta 48.35.1 (build 10) 36 minute unedited drive in Newport, RI - 36:13 - Kim Paquette

4:50 - here is a blind turn that worries me. The fence on the right blocks vision almost completely, requiring a driver to lean forward to see past it, which I think makes it very difficult, possibly impossible, for the side cameras to see given their placement without the car creeping so far forward that it's actually in the road. This might be a situation where Tesla needs to have the fisheye lens at the front try to see, if they can. If they can't do that, then I'm not sure what they can do other than have the car yell for help and/or remember and avoid these corners completely once encountered. It's also possible that the car saw perfectly fine when it decided to go and Kim overrode unnecessarily, but if I were Kim I wouldn't bet on that either.

Ten minutes in and I am impressed, even with the mistakes and hesitations. The roads are narrow with a ton of parked cars, and in a few situations with a number of right and left turns in succession the car handled them fairly well. Not well enough that I'd trust it without a human overseer who can override or tap the accelerator, but it's starting to look like it might be possible.

3:55 - car misses a turn. One key feature I continue to think Tesla needs really badly is having the car handle missed turns/lanes safely. If the car is about to miss a turn or is in the wrong lane, it isn't safe for the car to freak out like it did. It tries to take the turn waaaaaay too late after missing it, possibly because a parked car blocked its view of the lane. Tesla either needs several routes already calculated that it can shift to seamlessly if it misses a turn or is in the wrong lane, or it needs to be able to deviate from the set route when the maneuver isn't safe and just let the GPS recalculate.
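The fallback being suggested can be sketched as a simple decision rule. This is purely a hypothetical illustration of the proposed behavior (the function, its parameters, and the 3-second threshold are all my assumptions, not anything from Tesla's actual planner):

```python
def handle_upcoming_turn(distance_to_turn_m, speed_mps, lane_is_correct,
                         min_maneuver_time_s=3.0):
    """Decide whether to attempt the planned turn or fall back to rerouting.

    If the car is in the wrong lane, or there is too little time left to
    execute the maneuver smoothly, continue straight and let navigation
    recalculate instead of forcing a dangerously late turn.
    """
    time_to_turn = distance_to_turn_m / max(speed_mps, 0.1)
    if lane_is_correct and time_to_turn >= min_maneuver_time_s:
        return "take_turn"
    # Too late or wrong lane: stay in the current lane safely and reroute.
    return "continue_and_reroute"

# 5 m away at 10 m/s with a blocked/wrong lane: abandon the turn.
print(handle_upcoming_turn(5.0, 10.0, lane_is_correct=False))   # continue_and_reroute
# 60 m away in the correct lane: plenty of time, take it.
print(handle_upcoming_turn(60.0, 10.0, lane_is_correct=True))   # take_turn
```

The key point is that "continue and reroute" is always a safe default, which is exactly why a planner should prefer it over a last-moment swerve.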

28:34 - bad unprotected left. Looked like it was going to pull out in front of an oncoming truck, forcing a driver intervention.
 
Last edited:
  • Informative
Reactions: pilotSteve
4:50 - here is a blind turn that worries me. The fence on the right blocks vision almost completely, requiring a driver to lean forward to see past it, which I think makes it very difficult, possibly impossible, for the side cameras to see given their placement without the car creeping so far forward that it's actually in the road. This might be a situation where Tesla needs to have the fisheye lens at the front try to see, if they can.

Are you sure the forward-looking side cameras will not do the trick?
upload_2021-1-20_7-19-10.png

upload_2021-1-20_7-24-12.jpeg



Once the car has its nose in the street, it should have decent vision sideways. With some angle and some creep, I'd guess it would very soon realize whether it needs to slam the brakes or whether it is clear.
 
Last edited:
[FSD Beta 10] handles unprotected left turn where car doesn’t move - 17:05 - Tesla Owners Silicon Valley
For some reason I found this line/lane prediction failure particularly amusing:

not a lane.jpg

Autopilot knew via map data that there were 2 lanes to turn into, and it started out fine until the vehicle on the left covered up the tip of the straight-arrow road marking. That lane is so wide that the neural network believed the visible tail of the arrow was a skip stripe splitting the lane, so Autopilot made a last-second "correction" to get into the "2nd" turn destination lane, cutting off the vehicle in the inner turn lane.

I suppose what I found amusing was that these are the types of corner cases where what looks like a painted lane line isn't actually one once the view is unobstructed.
 
Are you sure the forward-looking side cameras will not do the trick?
View attachment 628990
View attachment 628991


Once the car has its nose in the street, it should have decent vision sideways. With some angle and some creep, I'd guess it would very soon realize whether it needs to slam the brakes or whether it is clear.
I have no idea. Until someone pulls video from the side camera and shows what the car can see in that situation, there's no way for us lay people to know for sure.
 
FSD Beta 48.35.1 (build 10) 36 minute unedited drive in Newport, RI - 36:13 - Kim Paquette
4:50 - here is a blind turn that worries me. The fence on the right blocks vision almost completely, requiring a driver to lean forward to see past it, which I think makes it very difficult, possibly impossible, for the side cameras to see given their placement without the car creeping so far forward that it's actually in the road. This might be a situation where Tesla needs to have the fisheye lens at the front try to see, if they can. If they can't do that, then I'm not sure what they can do other than have the car yell for help and/or remember and avoid these corners completely once encountered. It's also possible that the car saw perfectly fine when it decided to go and Kim overrode unnecessarily, but if I were Kim I wouldn't bet on that either.

Are you sure the forward-looking side cameras will not do the trick?
Once the car has its nose in the street, it should have decent vision sideways. With some angle and some creep, I'd guess it would very soon realize whether it needs to slam the brakes or whether it is clear.
I have no idea. Until someone pulls video from the side camera and shows what the car can see in that situation, there's no way for us lay people to know for sure.

I overlaid the Tesla 3 camera view pattern as best I could. Taking the fence into account, there is a certain amount of road on the right that is not covered by the cameras.


Tesla View2.jpg
 
  • Informative
Reactions: heltok
I'm not very happy with the quality of the footage though. Night time recording isn't good.
Nighttime makes the visualizations much easier to see as there is much less screen glare. Here's a neat example of how superhuman reaction speed can make up for not yet superhuman perception:

before frame.jpg

after frame.jpg


These two consecutive frames at 12 mph represent about half a foot of distance traveled, and Autopilot immediately updated its path to correctly find the right turn, which it couldn't see clearly because of fencing and buildings. At least in this case, even though perception was relatively slow to find the road, there was enough time to smoothly complete the rest of the turn.
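As a quick sanity check on the "half a foot per frame" figure, assuming a camera frame rate of roughly 36 fps (a commonly cited figure for Tesla's cameras, not something confirmed in this thread):

```python
# Distance traveled between two consecutive camera frames at 12 mph.
mph = 12
fps = 36  # assumed camera frame rate

feet_per_second = mph * 5280 / 3600   # 12 mph = 17.6 ft/s
feet_per_frame = feet_per_second / fps

print(round(feet_per_frame, 2))  # ~0.49 ft, i.e. about half a foot
```

So at city speeds the car gets a fresh perception update roughly every half foot of travel, which is why a one-frame-late detection can still leave plenty of room to react.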
 
FSDBeta 10 - 2020.48.35.1 - Short Drive Low Traffic with Good Cyclist Interaction
This is a different example of superhuman abilities making up for not yet superhuman perception:

before bike.jpg

after bike.jpg


These frames show the neural network incorrectly predicting paths into the bike lane, but most of the predictions, including the final ones to complete the turn, correctly followed the actual driving lane. While the driver was likely looking left for oncoming traffic, the cyclist entered the A-pillar blind spot, yet the predicted path immediately went wider, because the car is looking in all directions at all times.
 
Tesla FSD stress tested in Berkeley, CA (Beta 10) - 19:28 - AI DRIVR
Quite a few impressive maneuvers here. I agree with the comment about how "boring" parts are sped up or cut out in this video and others, even where FSD beta is making moderately complex turns and lane adjustments around incorrectly parked vehicles, etc. This example of continuing straight through an offset intersection was particularly interesting to me:

offset.jpg

I'm not sure how FSD beta internally represents these intersections, but at least looking at navigation, it says to make a right turn then an immediate left turn. Notice the predicted blue path does indeed follow a right then a left, so somehow Autopilot is selecting that one path out of many predicted paths as the best match, as opposed to simply following the next direction, which would incorrectly have picked a right turn.
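One way that selection could work, purely as a hypothetical sketch (nothing here reflects Tesla's actual planner; the functions and scoring are my invention): score each predicted path by how many of the upcoming navigation directions it satisfies in order, rather than matching only the single next direction.

```python
def score(candidate_turns, nav_directions):
    """Count how many upcoming nav directions the candidate satisfies, in order."""
    matched = 0
    for turn in candidate_turns:
        if matched < len(nav_directions) and turn == nav_directions[matched]:
            matched += 1
    return matched

def best_path(candidates, nav_directions):
    """Pick the candidate path whose turn sequence best matches navigation."""
    return max(candidates, key=lambda name: score(candidates[name], nav_directions))

# Navigation says: "make a right turn then an immediate left turn."
nav = ["right", "left"]
candidates = {
    "right_only": ["right"],               # naively following just the next direction
    "offset_straight": ["right", "left"],  # the jog straight through the offset intersection
}
print(best_path(candidates, nav))  # offset_straight
```

Scoring against the whole upcoming direction sequence is what makes the "right then immediate left" jog win over a plain right turn, matching the behavior seen in the video.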

Another interesting segment was FSD beta realizing it needed to yield to merge even before crossing the intersection:

merge.jpg

The vehicle directly to the left is orange-ish, indicating yield, while the car parked past the intersection is red and the vehicle ahead to follow is the usual green. The blue path also shows it knows it needs to shift left ahead. Most likely this was assisted by map data, where it knows there's a single lane past the intersection. (This is also likely the reason some beta testers have encountered mid-intersection lane shifts when map data is missing or has the wrong lane count.)
 
  • Like
Reactions: powertoold
At 3:15 he points out how FSD is determining the path of joggers far away from the intersection/road and stops to let them through early.
I don’t really like how he doesn’t intervene or report at all whenever the car is doing something iffy. From what DirtyTesla has said, Tesla treats hitting the go pedal the same as any other intervention and reviews that incident for improvement. He should be hitting the accelerator if the car is stopping or hesitating for no reason, or failing that at least hit the report button so Tesla can see there’s something there they can improve.

edit -

12:25 - weird how it treats this pedestrian differently than it did the ones at 3:15. At 3:15 it was overly cautious and stopped when it didn't really need to, since it had plenty of time to complete the turn before the joggers got to the street. Here it detects the pedestrian as soon as he starts walking across the road but doesn't brake until fairly late, when a human driver probably would have been slowing far earlier. Maybe because this one was walking and the others were jogging? Or maybe just because it's more cautious on turns than it is going straight?

13:13 - Jesus, dude lets the car screw up in a big way. Car properly stops at a stop sign where it needs to cross four lanes of fast-moving traffic. Car properly waits until straight-moving traffic passes, and then just goes, ignoring a car that was yielding to turn left and had the right of way, impeding another driver on the road, and he lets it happen! WHY? Guy doesn't even report it afterwards. Come on! Be courteous and safe for the other drivers on the road; you're the one who signed up to be a beta tester, not them. Put yourself and your car in danger or inconvenience yourself if you want by letting FSD fumble through, but don't do it to others.
 
Last edited:
  • Like
Reactions: mikes_fsd
I don’t really like how he doesn’t intervene or report at all whenever the car is doing something iffy. From what DirtyTesla has said, Tesla treats hitting the go pedal the same as any other intervention and reviews that incident for improvement. He should be hitting the accelerator if the car is stopping or hesitating for no reason, or failing that at least hit the report button so Tesla can see there’s something there they can improve.
I agree, and I've thought that myself...

But in this case, as a runner, I actually appreciate this behavior!
There are so many people that do not take pedestrians/runners into account.
 
I agree, and I've thought that myself...

But in this case, as a runner, I actually appreciate this behavior!
There are so many people that do not take pedestrians/runners into account.
That was pretty minor and I didn't really mind that one. But when he complains later in the video about FSD screwing up a stop sign by sitting there too long while the car in the cross street was turning right, and doesn't intervene by hitting the go pedal or even report it, it frustrates me, because Tesla wants that data and it doesn't look like he's giving it to them. And that's nothing compared to how flabbergasted and pissed I was to see him just let the car screw up the stop sign later in the video when it ignored the car turning left that had the right of way and just proceeded. Good on the other car for saving both of them by stopping, but if the car is driving like a dick, it's on the human steward to stop it and let the FSD team know it's doing something wrong.
 
  • Love
Reactions: pilotSteve