FSD Beta Videos (and questions for FSD Beta drivers)

How long before it can adjust speed to match road conditions?
Unless that turns out to be easy to do, likely a long time. It should be fairly low on the priority list, since safety-critical functions should have top priority, and the safest way to address unsafe road conditions for now is to have the car simply refuse to enable FSD in abnormal conditions. If and when Tesla gets the car working well in normal road conditions, then they can start addressing how to handle unsafe ones.
 
New video! Took some of your comments into account. Hope you enjoy it. Make sure to let me know if you want to see something different/new or have any ideas. I'll gladly steal them and take credit for all of it (evil laugh)


Just one question: why do you push the Autopilot stalk so hard? Like a 1960s gearshift. You could seriously do it very gently with just one finger while your hand rests on the wheel at the 3 o'clock position.
 


Thank you for doing a video in Chicago! It is good to see how this works in the city. The constant 4-way stops and other traffic on city streets, with some double-parked vehicles, make for a good test. It looks like this was filmed on a Sunday morning with a lot less traffic.
 
CMePrint: Tesla Full Self Driving (FSD) Beta 9 Downtown Los Angeles Chinatown Software 2020.48.26.1

[Image: wrong way scooter.jpg]


Interesting situation at 2:20 where there's an electric scooter going the wrong way, and FSD beta initially plans a path to the left, probably assuming the "pedestrian" doesn't move. This got me thinking of the earlier video trying to quantify/rate how well FSD beta is doing: this case didn't require a disengagement, with the vehicle slowing down and staying in the lane as the two of them got closer, but it wasn't totally right either.

It seems like one potential metric is something based on time periods, e.g., whether the autonomous system is making a correct decision every 30 milliseconds, since even for a "simple" situation like staying straight, the system could have made a "wrong" decision to turn and/or change acceleration. This avoids flawed metrics like counting disengagements per distance, which plenty of companies dislike, especially if they're testing in complex urban environments. But the time-based metric isn't great either, in that the baseline performance is very high: even basic pre-FSD Autopilot correctly stays in its lane going straight most of the time, as… that's what most of driving is.
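
To make that baseline-inflation point concrete, here's a toy Python sketch comparing the two metrics on the same hypothetical drive. Nothing here is anything Tesla or anyone else actually measures; the tick length, drive length, and error counts are all made up:

```python
# Toy sketch of the two metrics discussed above; all numbers are invented.

TICK_SECONDS = 0.03  # one planning decision every 30 ms, as suggested above

def correct_decision_rate(decisions):
    """decisions: one boolean per tick, True if the plan was 'correct'."""
    return sum(decisions) / len(decisions)

def disengagements_per_mile(disengagements, miles):
    return disengagements / miles

# A hypothetical 20-minute, 12-mile drive with 2 disengagements and 10 "wrong" ticks.
ticks = int(20 * 60 / TICK_SECONDS)                 # 40,000 decisions
decisions = [True] * (ticks - 10) + [False] * 10

print(f"correct-decision rate: {correct_decision_rate(decisions):.4%}")   # ~99.98%
print(f"disengagements/mile:   {disengagements_per_mile(2, 12.0):.2f}")   # ~0.17
# The tick-based rate looks near-perfect even on a mediocre drive, which is the
# baseline-inflation problem noted above: most ticks are trivially "keep going straight."
```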
 
This made me realize one other big difference between what Tesla is showing and what Waymo was showing back around 2014-15: Waymo showed predicted paths for objects, but Tesla does not. It seems like Tesla is not predicting the paths of other road users just yet, or is only doing it in a limited way, like "will keep following the lane it is in" or "is indicating left".
Or maybe Tesla has decided not to display the predicted-path information, making the comparison moot. Or maybe Tesla doesn't have this capability yet. There could be other reasons too, so making assumptions without additional information doesn't help much.
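
For what it's worth, the "limited way" could be as simple as constant-velocity extrapolation. A minimal sketch of that kind of baseline predictor, purely for illustration and unrelated to what either company actually runs:

```python
# Minimal constant-velocity path prediction for another road user.
# Purely illustrative; real stacks condition on lanes, signals, and learned behavior.

def predict_path(x, y, vx, vy, horizon_s=3.0, dt=0.5):
    """Extrapolate position forward assuming velocity (vx, vy) stays constant."""
    steps = int(horizon_s / dt)
    return [(x + vx * k * dt, y + vy * k * dt) for k in range(1, steps + 1)]

# A scooter 10 m ahead and 2 m to the side, approaching at 4 m/s while
# drifting toward our lane at 1 m/s:
print(predict_path(x=10.0, y=2.0, vx=-4.0, vy=-1.0))
```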
 
Seems like FSD updates and interest have dropped right off from a few weeks ago, and there's nothing out of Tesla or Elon about any progress. Maybe kicked down the road another year?

I don't think we can say that. It was the holidays, when the FSD team probably took some time off, so it would make sense if things were a little slow. I don't think we can conclude that FSD has been delayed a year. But I am curious when FSD Beta actually does get out of Early Access. That will be a big milestone.
 
Thanks for planning out interesting waypoints that are short, forcing FSD beta to do more complicated maneuvers, including a U-turn. Indeed, most navigation is just driving straight, so many short "trips" concentrate the information without needing as much post-editing to skip or speed up boring sections.

One neat failure of the birds-eye-view prediction as vision transitions from the main camera to the wide camera:
[Image: birdseye fail.jpg]

The visualization for parked vehicles and the predicted path were nicely lined up right up until this moment, when the main camera lost sight of the Kia Soul. Maybe due to the distortion of the fisheye camera, the unified birds-eye view predicted a duplicate parked vehicle well into your lane, resulting in a sudden swerve to a left path. Maybe angled parking isn't as common in the training set?

In any case, it's a good reminder that FSD beta drives on what it "believes," not necessarily what it "sees."
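
One way a duplicate like that could appear (just a guess, illustrated with a toy example; this is not how Tesla's network actually fuses cameras) is if detections of the same car from two cameras disagree on position by more than the association logic tolerates:

```python
# Toy illustration of a cross-camera association failure producing a duplicate
# object in a birds-eye view. Positions and the matching radius are made up.
import math

def fuse(detections, match_radius_m=1.5):
    """Greedily merge detections whose BEV positions are within match_radius_m."""
    fused = []
    for cam, x, y in detections:
        for obj in fused:
            if math.dist((x, y), (obj["x"], obj["y"])) < match_radius_m:
                obj["cams"].append(cam)
                break
        else:
            fused.append({"x": x, "y": y, "cams": [cam]})
    return fused

# The main camera and the distorted wide camera place the same parked Kia ~3 m
# apart, so this naive fusion keeps both: one of them sits "way into your lane."
detections = [("main", 12.0, 3.2), ("wide", 11.5, 0.4)]
print(fuse(detections))  # two objects instead of one
```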
 
Haha… Just as we were saying that staying straight is something pre-FSD Autopilot was pretty good at, it looks like FSD beta regressed, with two incorrect lane changes towards the end of this mostly-straight video from Tesla Owners Silicon Valley:

[Image: intersection.jpg]

Notice the steering wheel turning left to switch to the left lane inside the intersection without signaling. John does report the issue after the car corrects itself to stay in the right lane.

[Image: bad change.jpg]

Soon after the previous intersection, the car decides to switch to the left lane, which John points out is the incorrect lane as it becomes a left turn lane.

Both issues are most likely because the OSM data doesn't indicate how many lanes are available, so the path planner probably assumes one in each direction, giving preference to the lane closer to the yellow dividing line, since lanes further from the center are more likely to be bike lanes. This results in a sudden, unsignaled change while crossing the intersection, because the path planner believes that's the only lane going straight.
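
A hedged sketch of what that kind of fallback might look like when the OSM "lanes" tag is missing. The tag names are real OSM conventions, but the fallback logic is purely a guess; nobody outside Tesla knows how the planner actually consumes map data:

```python
# Hypothetical handling of missing OSM lane data. "lanes" and "oneway" are real
# OSM tags, but the default-to-one-lane behavior below is only a guess.

def lanes_per_direction(way_tags):
    if "lanes" in way_tags:
        total = int(way_tags["lanes"])
        return total if way_tags.get("oneway") == "yes" else max(total // 2, 1)
    return 1  # no lane tag: assume a single lane each way

# A street mapped without a "lanes" tag looks single-lane to the planner, so the
# lane nearest the yellow line gets treated as the only through lane.
print(lanes_per_direction({"highway": "residential"}))              # -> 1
print(lanes_per_direction({"highway": "secondary", "lanes": "4"}))  # -> 2
```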

I wonder how heavily Tesla will end up relying on the fleet to generate correct lane data for map updates vs. training the neural network to be confident enough to override bad/missing map data. Tesla most likely knew the limitations of relying on inaccurate map data, so it'll be interesting to see how they prioritize a generalized solution for the short/mid/long term.
 
The visualization for parked vehicles and the predicted path were nicely lined up right up until this moment, when the main camera lost sight of the Kia Soul. Maybe due to the distortion of the fisheye camera, the unified birds-eye view predicted a duplicate parked vehicle well into your lane, resulting in a sudden swerve to a left path. Maybe angled parking isn't as common in the training set?

I know it's not FSD beta, but I wonder if that is an explanation for why, when overtaking lorries (trucks) in the inside lane of a UK motorway, the lorry then appears to jump into the middle lane as you pass?

Unfortunately it usually results in a high speed aborted lane change and/or phantom brake event.


Source: Cruise control no longer safe
 
What is the reason for using OSM data and not some more complete data from, say, Google or HERE?

Has anyone tried it in a covered car park yet or does it not do those?

Dirty Tesla tried it, and the car wasn't able to make the first turn. The turn was pretty tight. I don't know if the fact it was covered was the issue, though. It sometimes has problems with simple paths in uncovered parking lots; at other times, it is able to navigate a tortuous path out of the lot.

The speculation is that Tesla hasn't specifically addressed parking lots in FSD beta yet.
 
Seems off that they wouldn't show it, though; it's really useful for understanding what the car is planning to do. When you look at a lot of the near-accidents it has, the driver could have taken over much earlier if they could see that information.
So instead of watching the road, you want decisions about whether to take over from FSD to be made by staring at the screen? It's certainly great to have visualizations, but you've then just created a new set of risks.
 
What is the reason for using OSM data and not some more complete data from, say, Google or HERE?
Practically speaking, Tesla had been using map data based on OSM for quite some time before FSD, including for Navigate on Autopilot and Smart Summon, and for all those uses Tesla already had the data in the car for offline navigation. So maybe Tesla figured the data was good enough to bootstrap the Autopilot functionality, with the intention of using it to train a general solution that doesn't rely on that particular map data. Also, OSM data can be more detailed and accurate in some areas, and Tesla seems to be able to overlay its own data when they need to.

As to why not use some other map provider? I guess there are costs to consider, though it may not be unreasonable to revisit past decisions now that Tesla has been profitable. But back to the previous point about this map data being a short-term solution: maybe the Autopilot team thinks they're close to figuring things out, so switching or adding map providers could be more work that ends up wasted.