Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Wiki MASTER THREAD: Actual FSD Beta downloads and experiences

It's a beta, and we're doing exactly that. You agree to it when you sign up for it. That's why it hasn't been released to all Teslas, just to those who jumped through the hoops to get it because they really, really wanted to beta test. I have no complaints about accepting it. I point out the flaws for informative reasons, but, having experienced AP since day 1, I also know and accept that it has significant limitations.

If you abuse the system and don't do as the instructions say, then you'll have problems. You always have the right to turn it off if you don't want it.
There seems to be a somewhat common misunderstanding of what "beta testing" means, and how it differs from production products that are still labeled as beta, like Tesla AP (e.g. games like TF2 for 10 years, Google Gmail for 5.5 years, and many other Google products).

Issues like this 10.3 rollout should be expected (or at least unsurprising) for anyone familiar with beta testing.
 
Other road users are not asked for agreement.

For example, the white car at 0:23 was not asked whether it wanted to participate in this testing.

 

I've had the same thing happen to me using it. It's something in the beta that needs work. The difference between that driver and me: I chose not to let it pull out in front of the other car. The video shows that driver did allow it to.

In safety situations, knowing that FSD has limitations (Tesla even states in the acceptance instructions that it will do the wrong thing at the wrong time), I always choose the safe approach rather than finding out if it actually makes the mistake I thought it was going to make... and it usually does. Right now, based on others' posts and my own experience, this is one of the issues it has. You learn from it and you prepare yourself. I've learned AP's weaknesses and don't even bother letting it get there; I override AP before it has the chance to do something wrong.
 
Oddly, it looks like FSD started to go, pulled a very hard left, and the driver then turned right and crossed the road into the path of the other car. But he was forced to take control after FSD had put the car in a head-on facing position on the side of the road.

I still can't figure out what FSD was trying to do. The path prediction was doing a weird extending/contracting thing, then started across the road, but the path then shortened completely and turned into a little nub pointing hard left. I haven't seen that before. I'm not sure what FSD was going to do next, but it was the driver who chose to turn hard right and floor it across the road. He could have stopped instead; that would have confused the other driver, who probably would have stopped too, but at least it would have avoided a near collision.

Yes, drivers monitoring FSD beta do need to stop more often instead of going for it. That was a weird one.
 
Is it just me, or do others find the way you can disengage FSD to be troublesome?

Seems to me there are three ways to disengage:
  1. Push the shifter up
  2. Hit the brakes
  3. Manual steering input
Mostly, this is just like NoA, so it's consistent (which is good). However, what I don't like at all is that #3 (manual steering input) does not disengage basic TACC, so FSD stops steering the car, but it continues plowing ahead. I can see that being logical for NoA on the freeway (where the car suddenly slowing down might be dangerous), but that's different from city streets.

When FSD goes wrong, for me it's mostly because the car goes in the wrong direction, and my reaction is to take control of steering to correct this... but of course this leaves TACC enabled, which I find unexpected (and dangerous). My feeling is FSD would be better off treating manual steering input as a complete return to manual driving (like hitting the brakes).
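The asymmetry described above can be sketched as a toy state machine. Everything here (the mode names, input labels, and the `disengage` function) is hypothetical, purely to illustrate the reported behavior as this post describes it, not Tesla's actual logic:

```python
# Toy model of the disengagement behavior described above.
# Modes and inputs are illustrative guesses, not Tesla's real states.

from enum import Enum, auto

class Mode(Enum):
    FSD = auto()        # FSD beta active: steering + speed control
    TACC_ONLY = auto()  # speed control only (steering back with the driver)
    MANUAL = auto()     # driver has full control

def disengage(mode: Mode, driver_input: str) -> Mode:
    """Return the new mode after a driver input.

    Models the asymmetry the post complains about: the stalk or the
    brake pedal cancels everything, but a steering override only drops
    FSD's steering and leaves TACC managing speed.
    """
    if driver_input in ("shifter_up", "brake"):
        return Mode.MANUAL          # full return to manual driving
    if driver_input == "steering" and mode is Mode.FSD:
        return Mode.TACC_ONLY       # car keeps accelerating on its own
    return mode

# The surprising case: a steering override does not give full control back.
assert disengage(Mode.FSD, "brake") is Mode.MANUAL
assert disengage(Mode.FSD, "steering") is Mode.TACC_ONLY
```

The change the poster proposes would amount to making every branch return `Mode.MANUAL`, so that any override is a complete handback.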
 
I do have to agree with this. I've never been a fan of the car continuing to accelerate on its own after I've had to take over. Your mind tells you you've taken control, yet you don't have full control. I do wish that any type of override would disengage all control from the car until I give that control back.
 
Other road users not being asked for agreement is also the case in any product testing, whether internal or external. In other words, the public's consent isn't ever required for product testing.

Even in the "autonomous vehicle world": the pedestrians of SF never consented to GM's Cruise testing on their streets, Phoenix suburbanites never consented to Waymo taxis on theirs, Elaine Herzberg never consented to Uber testing on the same street, etc.
 
You are technically correct. But it also means that "they opted in" is a moot point if the product causes harm to third parties.
 
In a sense, yes. Depending on what's being consented to, it is a moot point. People are conflating consent with risk. For example, in the same example above, what exactly is the white car's driver supposedly not consenting to? Participation? Nobody's consent is expected before putting a new product on the road. In fact, child passengers, children playing nearby, and even other adult drivers never consented to any cars being on the street. And if we are talking about head-on collisions, FSD beta drivers didn't consent to those either. Tesla isn't necessarily legally exempt if a head-on collision occurred above, even with respect to the FSD beta tester who "consented". What we consented to is the responsibility we have as drivers monitoring a system that has known risks, and in that regard, others' consent isn't necessary. Tesla (and FSD beta testers) are liable for certain risks, but not all.
 

Silly. How many of us were asked to share the road with drunk drivers, sleepy drivers, drivers coming off an amphetamine high and about to crash, etc.?
The point is that FSD beta plus an alert driver is safer than many of the drivers already out there.

If your goal is safety you'll get more accomplished by advocating camera based driver monitoring systems in all vehicles.
 
I started to write the exact same message and changed my mind; I decided I was done arguing the point. But it's absolutely true. We choose to get on the road every day with drivers whose situation we have no clue about, many of whom shouldn't be on the road. This is a system designed to improve safety, and in many respects it has. Tesla releasing it to more people who passed their "test" will let it gain more data faster and reach the desired results sooner. So maybe, I repeat, MAYBE, we're now only 3-5 years away from true FSD, whereas without the added beta testers we're likely at least 10 years out.
 
Yeah, people should check out this thread if they want to know what a real catastrophe is:
 
Yeah, it was just bad all around. With the angle and elevation, I'm not sure it could even see cars traveling on the road in the direction I wanted to go. I'm really glad that lane was clear when I took over and got out of the way of the white car. That driver probably thought I was drunk. I made sure to report it.
Most drivers would turn to the right, stop when approaching the intersection, then creep as necessary to see oncoming traffic. In other words, they square off a bit before proceeding. FSD doesn't seem to have this capability yet, especially when the roads don't intersect at 90 degrees.
 
I accepted the 36.5.1 update yesterday, and it appears to have installed (that's the version appearing on my software screen). However, my phone app still shows that it's downloading 36.5.1. It's at 100% and I have the spinning wheel of hope. Similar on the car's screen: it shows 36.5.1 still being downloaded, full green bar at 0 b/s. I've rebooted the car using the two-button technique; no change.

I wonder if it will eventually time out. I've taken the car for a short drive and everything works (except I have no Beta!). Any suggestions? Swing into a service center? I'm doubtful the FSD Beta email address would respond, but I could try that.
 
Give it a little time to self-correct, and if that doesn't work, simply request service in your Tesla app. They will be able to fix it remotely.

I had an update pause mid-download a few months ago because Tesla discontinued that particular release; I just happened to be caught mid-download when they stopped. Service handled it remotely and pushed a new download.
 
Yes, and drunk, sleepy, and high drivers are exactly why traffic laws exist. People who are caught driving drunk or high lose their driving privileges, and falling asleep at the wheel leads to varying discipline depending on the circumstances.

Those of us who follow this stuff can all imagine the NHTSA/NTSB licking their chops at incidents like this.
 