Welcome to Tesla Motors Club

Wiki MASTER THREAD: Actual FSD Beta downloads and experiences
Given multiple reports of various issues in more rural settings, this is an area the FSD Beta drivers can particularly make a big difference by testing and providing feedback. The Beta release development likely had far too little data from rural areas.
I got beta with the 10.3 release on Sunday morning and was at a friend’s place in ME. On our drive home, I used beta for the first time and we were surprised at how well it worked on the backroads. No phantom braking issues, etc. Weather was clear skies and sunny.

Now, in the city in the Boston area, it’s still impressive for what it’s able to do but it’s not an enjoyable experience for passengers who aren’t into beta testing. 😉
 
Has anyone noticed whether FSD slows down when taking a tight highway exit and/or on-ramp during medium to heavy rain? Today I disengaged, worried the high speed might cause the car to lose traction. Normal exits/on-ramps are fine; I'm referring to the sharp turns, which FSD will need to address eventually. I cannot imagine using FSD the way it works now when there is snow on the road.
I see we’re both in the Boston area… so given the weather we had yesterday (light rain), FSD beta did a great job getting off I-93 in Medford. This was around 4:30pm and I was surprised how well it did (actually, near perfect as I would’ve done it) compared to the AP / NoA code. That rarely took the same exit well.
 
I think it does better in rural settings. Last night I went to my son's for his 40th birthday dinner. FSD navigated through Mt Holly, which is an old town with narrow streets, and did not do very well there. I also had another case of it going left when it was supposed to go right: coming up from the bottom to a 'Y', the voice said go right and the car started left. I don't know if the blinkers were on when I disengaged. Either on that one or shortly after, it was so far off course that I couldn't make the maneuver myself and had to let it navigate back to the route (it did that fine).
 

I've had mixed results in rain. Yesterday was misty and FSD did OK, but on winding two-lane roads it has a tendency to cross the double yellow when the roads are wet and reflective. Various grades of asphalt can make a big difference in reflectivity. The car also goes ~10 mph slower on these roads, so I end up driving with the accelerator down. But I went through parts of downtown Lowell yesterday and the car did great moving around people and double-parked cars. The pavement was cobblestone, though. :)
 
Lots of brains and lots of cash does not equal success. It helps, sure.

Tesla will certainly be the first to claim something, but they won't be the first to achieve anything special in self-driving other than pushing a product to market far too early.

But the thing is, Tesla is behind all the other major players. Why? Everyone else has better strategies to get to L4 and beyond. Tesla was ahead early on, getting the basics of staying in the lane and following at an appropriate distance while on the freeway. Tesla took the low hanging fruit, and good for them. However, the system is just awful at pretty much everything else. From Guidehouse:

[Image: Guidehouse autonomous driving leaderboard]


I think Guidehouse probably got it right.

2045 is an optimistic timeframe for L4 or L5 from Waymo and similar, with Tesla far behind.

Speech to text is still mediocre to awful across the board. Oh, if you speak slowly and enunciate, it's OK until there's a homophone. Then it's a clustercluck. How about two homophones in a row? Oh dear. Now try speaking at a normal speed, without extra enunciation.

And speech to text is a far easier problem than computer vision and decision-making in a car.

Tesla tries to solve the problem with cheap sensors and semi-expensive processing, essentially believing in the religion of neural nets. Actual L5 self-driving cars are likely going to take not only a lot more sensors, and some expensive ones at that, but also a several-orders-of-magnitude increase in processing power. No relatively simple algorithm is going to solve this multi-faceted puzzle.

Either there's going to be some genius AI breakthrough from somebody, which is possible but unlikely, or there's just going to need to be oodles and oodles of processing, with the problem broken down into countless tiny steps and sensors, all orchestrated together.
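The "countless tiny steps and sensors, all orchestrated together" idea can be sketched in code. This is purely a toy illustration of the orchestration pattern being argued for; every module name, confidence value, and threshold here is made up, and no real autonomy stack works exactly this way:

```python
# Toy sketch of orchestrating many small perception modules, each weak
# on its own, combined by a fusion step and a separate decision step.
# All module names, numbers, and thresholds are hypothetical.

def camera_detect(frame):
    # Stand-in vision module: returns (object, confidence) pairs.
    return [("car", 0.7), ("pedestrian", 0.4)]

def radar_detect(sweep):
    # Stand-in radar module: good at range/velocity, weak at classifying.
    return [("car", 0.9)]

def lidar_detect(cloud):
    # Stand-in lidar module: strong geometry, so higher confidence.
    return [("car", 0.95), ("pedestrian", 0.8)]

def fuse(detections_per_sensor):
    """Combine per-sensor confidences: keep the max seen for each object."""
    fused = {}
    for detections in detections_per_sensor:
        for obj, conf in detections:
            fused[obj] = max(fused.get(obj, 0.0), conf)
    return fused

def plan(fused, threshold=0.75):
    """A tiny decision step: brake if a confident pedestrian is present."""
    if fused.get("pedestrian", 0.0) >= threshold:
        return "brake"
    return "proceed"

fused = fuse([camera_detect(None), radar_detect(None), lidar_detect(None)])
print(plan(fused))  # lidar lifts pedestrian to 0.8, so: brake
```

Note that with the camera alone (pedestrian at 0.4), this toy planner would have said "proceed"; it's the extra sensor that tips the decision, which is the crux of the radar/LIDAR argument above.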
I’ll go out on a limb here and say I’m 100% certain we’ll have an L4 or L5 vehicle BEFORE 2045, and then out on a branch and say I’m 90% certain Tesla won’t have robotaxis in 2 years. Probably not even 3-4. Maybe not even 5. I’ll set some calendar reminders to check back on. ;-)
 
FSD Beta also did a weird thing of moving to the left lane to make a right turn. It moved over to the left lane as if to pass the car in front, and then immediately moved over to the right lane again before making the right turn.

Yup, that was introduced with 10.3.1. It doesn't happen every time, thankfully, and it'll also do it when turning left... moving waaaay over to the right (sometimes all the way to the shoulder) before swinging back over to the left-turn lane.

Really got my attention the first time it did it, but fortunately there weren't any other cars around at the time.

I heard from someone who is an expert on Tesla development that this happened because 10.3.1 merged in the Tesla Semi code base for FSD. This shows how far ahead of everyone Tesla is, in their development of autonomous transport. Tesla may solve the US supply chain issues all on their own. ;)


EDIT: I see I got an "Informative" rating on this post. 😂 I want to be clear that I am not serious. I thought the ;) was sufficient, but perhaps not.
 
This was along the same roads as my first run. So, what the hell happened????
Even if the car has been down the same road many times, FSD Beta is negotiating it anew each time. (This also provides FSD with enormous flexibility and adaptability.) So, other vehicles, environmental differences, etc., etc. can affect FSD Beta's behavior from one drive to the next on the very same road. Tesla can collect tons of data on these drives, but those data won't influence the next drive within the same software release. Of course, those data are inputs for the next software release, so future driving on the same road using new software may be different as a result of FSD Beta changes.
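The point above, that the policy is frozen within a release while the inputs vary, can be made concrete with a sketch. This is not Tesla's actual architecture; the class, version string, and conditions are invented purely to illustrate the "same road, different drive" behavior and why logged data only matters for the next release:

```python
# Illustrative sketch (not Tesla's real design): within one software
# release the driving policy is frozen, so behavior differences on the
# same road come only from differing live inputs. Logged data does not
# change this release's decisions; it is raw material for retraining.

class FrozenPolicy:
    def __init__(self, version):
        self.version = version
        self.log = []  # data collected for some FUTURE release's training

    def decide(self, road, conditions):
        # Same road, same release -- but the action depends on what the
        # sensors see right now (traffic, weather, parked cars, etc.).
        action = "slow" if conditions.get("oncoming_car") else "normal_speed"
        self.log.append((road, conditions, action))
        return action

policy = FrozenPolicy("10.3.1")
print(policy.decide("Main St", {"oncoming_car": False}))  # normal_speed
print(policy.decide("Main St", {"oncoming_car": True}))   # slow
# policy.log now holds two drives' worth of data, but nothing in this
# object changed how decide() works -- only a new release would.
```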
 
I live in a rural area, and I can say that I end up disengaging at the majority of intersections with other cars because my car just does stupid stuff all the time. Phantom braking is a nightmare, but only on certain roads (since the .1 release it has gotten more gentle, but if a car is behind, it keeps lighting up the brake lights). It will brake for lots of shadows, dips in the road, or nothing at all, but if you come to an unmapped stop sign doing 60 mph it will blow right through, lol (well, it will slam on the brakes and sound an alarm to let you know a stop sign is going by).

After I disengage at intersections with traffic, I don't want to re-engage if cars are in front of me, because the auto high beams will more often than not kick on before I can turn them off, and then it looks like I am flashing my high beams at the other car. I've had 2 days in a row where it phantom braked pretty hard, then when I tapped the accelerator it still didn't want to go, and when I pushed the pedal down to override, it triggered an FCW. I even saved the video clip from the dashcam to review when I got home, thinking maybe something was near the road that I just didn't see, but nope. I'm constantly having to tap the button to make the wipers go because the car can't understand drizzle and won't let you turn off auto wipers. Anyone know how to retrofit a high-beam and wiper sensor set from, like, a mid-'90s car? lol

I have been providing feedback, but I just provide the feedback once (with the button) and then wait until the next update. I understood what I was getting, for the most part, when requesting to join the beta, and I am happy to have it and to contribute data to make it better. It's just frustrating that it can't handle some of the simplest things but can do much more complicated things. I know this is an incredibly difficult task they have, and I am looking very forward to seeing how this progresses.
 
That's interesting, and there are certainly lots of serious players in this field. However, I doubt it fully considers the rate at which AI can develop, particularly with Tesla's massive data intake and their approach.
That's the neural net religion, though. Belief without proof that, if we just have enough data flow through the neural nets, then they'll magically figure things out. That works to a certain extent, but not any further.
What is strategy, and how is it ordered? That is, how have they placed them along the horizontal axis?
I didn't buy the full report, which is expensive, but the ranking agrees with at least one other I've seen.

I think the fundamental flaws with the Tesla system are:
Some important positions lacking cameras (four corners pointing straight left and right). There's no cross traffic awareness. Also, humans not only have a binocular swivel head that can see a pretty far range, we also have three mirrors that show us a pretty good range too.
Thinking that, if we just get enough data, the neural nets will solve the engineering problem for us. I cannot disprove this as possible, but from what I've read it would take a lot more processing power and layers of neural nets to get there. And even then it's a bit of a crapshoot.
Dropping radar. I get repeated warnings when in areas of bright light and shadows. It's silly.
Avoiding LIDAR. That solves a major vision problem easily. Why avoid it?

In actual real-world use, this all shows up in countless ways:
I can see the cars ahead of the car ahead of me, and I can slow down when needed.
I can see the brake lights half a mile down the road, or the bunching of traffic, and give some extra following distance.
I can see the traffic light half a mile away and anticipate.
I can see the guy drifting in his lane and about to signal he's moving over, so I give him room before he signals.
I can see the look on the face of the woman across the intersection and know I should go or she should go.
I can see the guy on the other side of the parked truck, walking rapidly towards the street, and swing wide for when he walks into the road, even though I can't see him for most of this interaction.

As far as I can tell, the Tesla system does none of this. As far as I can tell, the Tesla system barely sees past its own nose. The vision recognition is poor, only showing a fraction of cars on the road that I can see nearby, let alone far away. Does it know about that mid-corner bump coming? Definitely not. A significant amount of my driving involves anticipating and avoiding road imperfections, and Tesla doesn't do this at all.

So I think Tesla took the dark side approach to automotive AI. They took the easy path on the low hanging fruit, but they don't seem to have much of a long term technological strategy beyond more data and more processing.

In sharp contrast, have you looked at all at what Waymo does? It's far more sophisticated.
I hope you're right. I don't see any evidence that you are, though.
 
I have a feeling we'll be seeing more news/developments in the future about FSD Beta and multiple drivers.

Was watching a video the other day where someone brought up an interesting point about sharing a vehicle with FSD Beta enabled and how there's potential for anyone to get into a vehicle with Beta and get themselves into bad situations or cause other problems for themselves and/or the owner of the vehicle. As an example, imagine dropping your Beta-enabled vehicle off at a valet.

Next up -- facial recognition tied to safety score tied to ability to activate the software?
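Just to make the speculation concrete, the control flow such a gate would need might look like this. To be clear, nothing here is a real Tesla feature or API; the recognition step, threshold, and data structures are all invented for illustration:

```python
# Hypothetical sketch of a "recognition -> safety score -> feature gate"
# chain, as speculated above. The threshold, the lookup-based
# "recognition," and all names are made up for illustration only.

SAFETY_SCORE_THRESHOLD = 98  # assumed cutoff, purely illustrative

def recognize_driver(camera_frame, enrolled):
    # Stand-in for a real face-recognition step: here, a dict lookup.
    return enrolled.get(camera_frame)

def can_activate_beta(camera_frame, enrolled, scores):
    driver = recognize_driver(camera_frame, enrolled)
    if driver is None:
        return False  # unknown driver (e.g., a valet) -> no Beta
    return scores.get(driver, 0) >= SAFETY_SCORE_THRESHOLD

enrolled = {"frame_owner": "alice"}
scores = {"alice": 99}
print(can_activate_beta("frame_owner", enrolled, scores))  # True
print(can_activate_beta("frame_valet", enrolled, scores))  # False
```

The interesting design question is the failure mode: an unrecognized driver (the valet case) is denied outright, while a recognized driver with a lapsed score is denied by the threshold check.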


I wonder if anyone has tried to privately sell a Beta-enabled vehicle for a big premium yet (not sure of the logistics here myself).
 
What I would do, if I were going to lend my car, is turn off FSD Beta in the menu. In the past, when I have let someone use my car, I lock it in Chill mode and with speed restriction. What worries me even more is seeing some of these YouTube videos people are putting out about their first drives with Beta. I watched part of one today where the person told the nav where to go and then just waited... then he said it must not be ready yet, then said, "I guess I need to take it out of park"... then waited, then tried tapping the accelerator, then said he was waiting for the steering wheel icon to turn blue. Eventually he figured out he needed to actually use the stalk to engage the system. Stuff like that is terrifying to me; clearly he didn't have the first clue about how to use the system or what to expect.
 
Yeah, it definitely seems like there is significant potential for bad things in this area, ranging from the safety of a random driver who isn't the owner of the vehicle, to someone who gets in the vehicle and messes up the owner's safety score and Beta admission, to people who own the vehicle but still don't really know what to expect.
 
Is PIN to Drive specific to the car, or to a user profile - meaning each user profile could have a different PIN, or could PIN-lock the one that has FSD turned on? Sure, anyone on a profile other than a Valet profile could technically turn something ON in the settings, but that would require them to know it’s there and enable it.

There used to be this wonderful sign at the top of Nevada Falls in Yosemite. It also had a picture of a little stick figure man (most likely) going over the falls, but the English text here is still the same-ish, that last line being the kicker. Maybe Tesla should put something else in the FSD enablement warning with some stronger language or imagery?
 
Attachments: photo of the warning sign (46DC02E3-9227-486D-A827-C06B5045ED68.jpeg)
I think you might be suffering from complexity bias. Just because a solution is more complex doesn't mean it's better than a simpler solution. It's often the opposite.
 
I think another thing that causes problems is that we live in such a litigious society that there are warning labels on everything, and many of them you can just disregard because they're some CYA type of thing. But then you have the warnings that really do matter, and people are just like: yep, accept, I know, I know, I have to "pay attention," yeah, "bad things" can happen, sure, sure, accept.
 
According to a Google search, PIN to Drive is car-specific, although I don't have firsthand knowledge of this. I would be more worried about someone accidentally engaging it than going into a menu to turn it on, at least in my specific case, as I wouldn't be lending it to anyone outside of family. Hell, one day I was visiting my dad and his truck was blocked in, so I told him to just take my car to run to the store (before Beta), but he wouldn't because he didn't feel like he understood how to drive "that thing" lol. I guess the nice thing is that since the brake disengages everything, if someone were to accidentally activate it and the car started doing stuff on its own, a person's instinct would probably be to hit the brake.
 
It's pretty good man.

Doesn't know how to U-turn.

If the lanes are too close, the second the car turns it gets confused.

Still annoyed it changes lanes in the middle of the intersection

But overall really good for being only 10.3 beta

I wonder when we're going to get the Live sentry mode update.
 
Well, I gave FSD one last try at turning onto my street from each end. It usually fails at both ends, even in good weather like today. I've given up on the obstructed end and just hope FSD will automatically reroute me to the other end. On the unobstructed end, my last attempt was at 15 mph, so this time I tried 10 mph. There were no obstructions, and FSD did try to make the turn, but it eventually gave up and went past the street. It makes this turn maybe 40% of the time. My neighbors were outside and saw me, so I made a point to let them know I was participating in Tesla's FSD beta program. They wondered what was going on when they saw the car jerking back and forth.
 