Wiki MASTER THREAD: Actual FSD Beta downloads and experiences

This is a half-baked theory, so maybe others who got the Beta today can chime in to agree/disagree...

After spending about 3-4 hours on FSD today, I’m thinking that people with a Safety Score of 95 or lower will not enjoy the current state of FSD. The hardest part of my day was giving the vehicle the time/space to figure things out on its own.

It was quite slow/conservative in many situations, and my biggest concern throughout the day was upsetting drivers around me. The honking, flipping the bird, and frustrated passing were honestly rough.

If your driving style or driving environment puts you around drivers and situations like this more often, then you’re gonna spend a good bit of time overriding FSD and wanting to drive manually.
 
[Attached image]
Told my wife today that we need to order this sticker if we’re gonna keep using FSD. I kept wanting to apologize to other drivers.
 
It was quite slow/conservative in many situations, and my biggest concern throughout the day was upsetting drivers around me. The honking, flipping the bird, and frustrated passing were honestly rough
Be a good citizen, and help Tesla out with the development of FSD: intervene and disengage whenever it is not driving correctly, or being too conservative or slow (or being too aggressive!). It’s also likely safer to disengage all the time. The car is not intended to be driving itself on FSD! You are the driver.

Best to not annoy other drivers. That definitely does not help Tesla.

The bar should be: have it drive exactly as you would, assuming you are a decent driver who does not normally irritate other drivers.

The more interventions, the better, most likely.
 
This thing uploads insane amounts of data, already up to 13 gigs uploaded today and counting.
Okay so, does it upload data even if we don't have FSD engaged? I feel unsafe with FSD on around these streets. Is it still recording video to be analyzed so it can do better in future updates? Or do we have to have FSD engaged in order for it to record the surroundings and upload? How can we figure this out? I'm not even sure how to check my router to see what has been uploaded.
 
Okay so, does it upload data even if we don't have FSD engaged? I feel unsafe with FSD on around these streets. Is it still recording video to be analyzed so it can do better in future updates? Or do we have to have FSD engaged in order for it to record the surroundings and upload? How can we figure this out? I'm not even sure how to check my router to see what has been uploaded.
I didn't take any FSD snapshots, and only enabled it briefly on a 12-mile drive, and it still uploaded another 6-7 gigs when I got home.
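
On the router question: most routers show per-client traffic somewhere in the admin page, and some can export it. Here's a rough Python sketch of how you could total up the car's uploads from such an export; the CSV layout, column names, and MAC address below are hypothetical, so adjust them to whatever your router actually gives you.

```python
# Rough sketch, not tied to any specific router: assumes a per-client traffic
# log exported as CSV with columns timestamp, mac, bytes_up, bytes_down.
# These column names are hypothetical -- adjust to your router's actual export.
import csv

CAR_MAC = "aa:bb:cc:dd:ee:ff"  # replace with your car's Wi-Fi MAC address


def total_upload_gib(log_path: str, mac: str) -> float:
    """Sum upstream bytes attributed to one MAC address and return GiB."""
    total_bytes = 0
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            if row["mac"].lower() == mac.lower():
                total_bytes += int(row["bytes_up"])
    return total_bytes / (1024 ** 3)


if __name__ == "__main__":
    print(f"Car uploaded about {total_upload_gib('router_traffic.csv', CAR_MAC):.1f} GiB")
```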
 
Be a good citizen, and help Tesla out with the development of FSD: intervene and disengage whenever it is not driving correctly, or being too conservative or slow (or being too aggressive!). It’s also likely safer to disengage all the time. The car is not intended to be driving itself on FSD! You are the driver.

Best to not annoy other drivers. That definitely does not help Tesla.
The bar should be: have it drive exactly as you would, assuming you are a decent driver who does not normally irritate other drivers.
The more interventions, the better, most likely.
I’d love to see an official Tesla stance on this. Good examples of when to disengage & when to let it try. What do they really want to see reported, & what’s the optimal handling for the system to learn? Does disengaging and then correcting actually show what the system should’ve done?

I based my “give it a chance” approach on the FSD testers I’ve watched on YouTube over the last year. I was under the impression they gave it a decent bit of leash if there wasn’t imminent danger.
 
Any disengagements?

PS: There were a ton of discussions here earlier about the kinds of "triggers" Tesla can have in the software. When a trigger condition is met, data is captured and uploaded to the mothership.
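
To make that concrete, here's a purely illustrative Python sketch of the trigger idea (not Tesla's actual code; every name below is made up): a set of named conditions is evaluated against the current drive state, and any condition that fires results in a clip being captured and queued for upload.

```python
# Purely illustrative sketch of the "trigger" concept, not Tesla's actual code.
# Premise: evaluate trigger conditions against the current drive state; when
# one fires, capture a clip and queue it for upload.
from dataclasses import dataclass
from typing import Callable, List, Tuple


@dataclass
class DriveState:
    fsd_engaged: bool
    driver_disengaged: bool        # driver took over on this frame
    hard_brake: bool               # deceleration above some threshold
    snapshot_button_pressed: bool  # driver tapped the report/snapshot button


# Each trigger is just a named predicate over the drive state.
TRIGGERS: List[Tuple[str, Callable[[DriveState], bool]]] = [
    ("disengagement",   lambda s: s.fsd_engaged and s.driver_disengaged),
    ("hard_brake",      lambda s: s.hard_brake),
    ("manual_snapshot", lambda s: s.snapshot_button_pressed),
]


def fired_triggers(state: DriveState) -> List[str]:
    """Return the names of all triggers that fire for this state."""
    return [name for name, condition in TRIGGERS if condition(state)]


# Example: a disengagement while FSD is active would queue a clip for upload.
state = DriveState(fsd_engaged=True, driver_disengaged=True,
                   hard_brake=False, snapshot_button_pressed=False)
for name in fired_triggers(state):
    print(f"trigger '{name}' fired -> capture clip and queue it for upload")
```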
Yeah, I disengaged when I knew it was going to attempt something stupid, like when the nav route was obviously wrong. Also disengaged when it decided to straddle two lanes instead of staying in one. Did most of the drive manually after that.
 
Yeah, I disengaged when I knew it was going to attempt something stupid, like when the nav route was obviously wrong. Also disengaged when it decided to straddle two lanes instead of staying in one. Did most of the drive manually after that.
I wonder whether you could upload the exact route you want to use (from Google Maps) and have the FSD Beta follow that route...
 
So I took my 1st drive with the Beta. Really no issues, but I didn't stress it with anything crazy. I am surprised by how hard it accelerated. Started in my subdivision pointed in the wrong direction for the simplest route, to extend the ride. The road turns to the left about 5 houses down from our house. When I engaged it, it really took off quickly. I would say it overdid it. It was a 25 MPH zone and it wanted to get to 25 PDQ. I also have it set for AP to drive 5 over. I need to change that, because it got to 30 pretty fast. I also noticed this on some right-hand turns at lights. It accelerated a good deal faster than I do. To be clear, it did nothing illegal, just more aggressive than I would be in a subdivision or on a normal right-hand turn.
 
when to let it try

The time I proposed intervening is the moment it begins to do something incorrect, or you suspect it will. This may be very frequently. That helps Tesla understand how to improve the system to better fit with human expectations, over time, when they get a chance to go back and analyze disengagements carefully as they focus on particular aspects of the path planning and driving cost functions, and assuming they gather sufficient data to be useful. There's literally zero benefit to allowing the car to continue to drive if it's not driving correctly, except possibly as an exercise on a deserted road to better understand the behavior of the system and where it is likely to misbehave with other drivers present, but that creates the risk of a collision with a fixed object if you are not sufficiently alert.

what’s the optimal handling for the system to learn?
The system does not learn anything from being driven.

I based my “give it a chance” approach on the FSD testers I’ve watched on YouTube over the last year

YouTube is not a great source of this sort of information.

I’d love to see an official Tesla stance on this

I would too; it would help to have an official policy and an official explanation & guide to counter the misinformation. I assume they have information available to their internal testers, though maybe I'm wrong.
 
Be a good citizen, and help Tesla out with the development of FSD: intervene and disengage whenever it is not driving correctly, or being too conservative or slow (or being too aggressive!). It’s also likely safer to disengage all the time. The car is not intended to be driving itself on FSD! You are the driver.

Best to not annoy other drivers. That definitely does not help Tesla.

The bar should be: have it drive exactly as you would, assuming you are a decent driver who does not normally irritate other drivers.

The more interventions, the better, most likely.
What if I normally irritate other drivers but have been on my best behavior to qualify for FSD?