Wiki MASTER THREAD: Actual FSD Beta downloads and experiences

I got 11.4.4 two days ago, and I don't really notice much difference from 11.4.3. It still has a few of the quirks that 11.4.3 has. I haven't noticed any regressions, so I'm keeping it enabled on my car for now.

My trip to and from work can take two routes depending on traffic. So far the most interventions I've had to make is two; often it's just one. I don't count cancelling lane changes as interventions. This includes the times I've been using 11.4.3. The routes cover a mix of highway driving, merging, roundabouts, left- and right-hand turns, double-turn lanes, HOV lanes, non-highway roads in a medium-density city, construction, and some unmarked roadways. I've tested it out on clear days, driving into the sun, on cloudy days, and even in heavy rain (which it warned me about).

However, on one left-hand turn on one of my routes back home, FSD 11.4.3 and 11.4.4 still seem to cut a bit too close into the other turn lane (i.e. on the road that I'm turning left onto). It's probably a learned behaviour from bad drivers.

Another quirk that still exists is late merging into traffic. FSD seems to follow the lane that's ending and then tries to merge when it gets close to, or reaches, the end of the merge lane. Normally you'd try to merge into traffic at the first opportunity.

On the highway, FSD lane changes are much better; it looks for faster traffic and will tend to try to get out of the left lane. However, the HOV lanes that EVs can drive in are on the left. At the entry and exit points of the HOV lanes, traffic tends to open up a bit, and FSD falsely thinks traffic is faster in the right lane and will try to get out of the HOV lane. But if you look further into traffic you can see that the HOV lane is booting along while the regular lanes have slowed down, so it would make sense to stay in the HOV lane.
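
To put toy numbers on that comparison (the speeds below are invented, and this is just my guess at framing the near-view vs. far-view difference, not how Tesla's planner actually works):

Code:
# Hypothetical illustration of the HOV-lane quirk: judging lane speed only from
# the vehicles right beside you vs. averaging further down the road (km/h, made up).
hov_lane     = [110, 108, 40, 105, 112, 110]   # brief slow spot at the HOV entry/exit gap
regular_lane = [95, 100, 102, 60, 55, 50]      # opens up locally, then slows further ahead

def avg(xs):
    return sum(xs) / len(xs)

# Looking only at the nearest few vehicles, the regular lane looks faster...
print(avg(hov_lane[:3]), avg(regular_lane[:3]))   # 86.0 vs 99.0
# ...but looking further into traffic, the HOV lane is clearly the better bet.
print(avg(hov_lane[:6]), avg(regular_lane[:6]))   # 97.5 vs 77.0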

Double-turn lanes at a highway offramp also have a quirk. When turning right, FSD seems to favour the outer turning lane. I usually don't like that, because often it's a combined right- and left-turn lane, so if I'm turning right I tend not to pick it, since there may be a car turning left that will block me until the light turns green. I usually have to intervene to move over to the inner turning lane, because the turn signal is already on and I can't tell the car to change to the right lane.

One thing I wish is that when you intervene and take over from FSD, it would also disable cruise control. A few times I've intervened and then been taken by surprise when the car surges forward because the cruise control was still on.

Other than that, no "oh sh*t" moments.

FSD 11.4.3 and 11.4.4 seem good enough that I'm getting confident enough to add a bit of acceleration when turns or lane changes are going a bit slower than I'd like.


PS: has anyone tried FSD without a destination set in the navigation? Where does the car go?

--
S.Lam
For me it goes straight until it encounters a tee, then it turns right. If one is close enough to the Home location, it will go home.
 
Reactions: Z_Lynx
Yowza! There was more to Ross' drive than reported. Two disengagements: one for driving too close to a trash truck that was backing up, and the other for that near miss after attempting to run a stop sign. They happened within 40 seconds of each other, while Ross was jawboning about how the first disengagement was no big deal. The near miss was fugly.

Against my better judgement, I activated FSD 11.4.4 on my way to pick up a pizza this evening. Probably the worst of the 11.4.x releases I've seen on a refreshed MS Plaid. It seems like we are at the limit of HW3 capabilities and the current camera placement; the last six months of releases have been, at best, a net zero gain.
 
Two short drives today, both with the same recurring problem. On a four-lane road I came to a place where there are two designated right-turn lanes spaced about 50 yards apart. There was no traffic behind me, and these turns were not part of the route, but the car put on the right turn signal, got into the right-turn lane, drove all the way to the end, realized its mistake, put on the left turn signal and got back into the rightmost lane of the roadway, then 100 feet later did exactly the same thing with another dedicated right-turn lane. The second drive was on a different road but with the same situation, two dedicated right-turn lanes about 100 feet apart, and the car did exactly the same thing. The car must have a fetish, because it keeps going in and out of dedicated right-turn lanes.
 
Reactions: Rikster
Two short drives today, both with the same recurring problem. On a four-lane road I came to a place where there are two designated right-turn lanes spaced about 50 yards apart. There was no traffic behind me, and these turns were not part of the route, but the car put on the right turn signal, got into the right-turn lane, drove all the way to the end, realized its mistake, put on the left turn signal and got back into the rightmost lane of the roadway, then 100 feet later did exactly the same thing with another dedicated right-turn lane. The second drive was on a different road but with the same situation, two dedicated right-turn lanes about 100 feet apart, and the car did exactly the same thing. The car must have a fetish, because it keeps going in and out of dedicated right-turn lanes.
This same behavior happens to me all the time on 11.4.4! It was actually even a bit worse on 11.4.3. It was not an issue at all on 11.3.6. It makes FSDb nearly unusable at times, and certainly makes most drives more nerve-wracking than simply driving myself. I get it in left turn lanes on 4-lane divided highways also. Hope Tesla can figure out a way to fix this major defect soon!
 
We drove yesterday from Loveland, CO to Grand Lake, CO over Trail Ridge Road using FSD 11.4.4.
It's exciting knowing you are only a couple of feet away from a drop of several hundred feet.
[Attached photo: IMG_9343.jpg]

I thought FSD did a great job.
I have done this route several times over the past few years and this is the first time that FSD performed "almost" perfectly.
There are numerous hairpin turns and FSD did the turns smoothly.
A couple of interesting things:
* Another car on the side of the road had its door open. I was putting pressure on the steering wheel to turn slightly, and FSD turned on its own to provide clearance.
* There were several parked cars on the side of the road, and a child was moving between two of them heading toward the road. FSD slowed way down. I could see the child was looking at me, so we could continue.
 
Here's where vision-only fails. It looks 2D, but it's dead serious 3D.


It's not working right now, but what about this scenario inherently precludes vision from working? A minimum of two forward-facing cameras means you have binocular information, and machine learning can be trained to perceive the depth of anything just as well as a human driver.

Binocular vision might not even be necessary. After all, we're able to intuit the depth of the poles from context, even in this single-viewpoint video.

This is actually one of my biggest annoyances with discussions around autonomy at the moment. Most arguments boil down to "It doesn't work now, therefore it will never work."
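
For anyone curious about the binocular point, here's a minimal sketch of how depth falls out of stereo disparity, assuming rectified cameras; the focal length, baseline and disparity values are made-up placeholders, not anything from Tesla's hardware:

Code:
# Classic pinhole/stereo relation: depth Z = f * B / d, where f is the focal length
# in pixels, B the camera baseline in metres, and d the disparity (horizontal pixel
# shift of the same feature between the two images). All values are invented.
import numpy as np

focal_px     = 1200.0                         # assumed focal length (pixels)
baseline_m   = 0.30                           # assumed spacing between cameras (metres)
disparity_px = np.array([60.0, 24.0, 8.0])    # sample matched-feature disparities

depth_m = focal_px * baseline_m / disparity_px
print(depth_m)   # [ 6. 15. 45.] metres -- smaller disparity means farther away

The hard part in practice isn't the formula, it's reliably matching the same feature in both images (especially thin poles and wires), which is where the learned part has to carry the load.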
 
It's not working right now, but what about this scenario inherently precludes vision from working? A minimum of two forward-facing cameras means you have binocular information, and machine learning can be trained to perceive the depth of anything just as well as a human driver.

Binocular vision might not even be necessary. After all, we're able to intuit the depth of the poles from context, even in this single-viewpoint video.

This is actually one of my biggest annoyances with discussions around autonomy at the moment. Most arguments boil down to "It doesn't work now, therefore it will never work."
And for others the biggest annoyance is that so much isn't working and there are no significant signs of improvement after months or even years of releases and an expanded release base. No doubt, given enough time, everything fails now and then, but with Tesla so often it never gets off the ground in the first place.

Is it a poor binocular camera design, an inadequate data set, or some combination of both? Who knows, but for now it's unsafe, and although Tesla might be aware of the issue, it's likely very low on the priority list, if it's there at all, guaranteeing it won't work anytime soon.
 
Reactions: jebinc
And for others the biggest annoyance is that so much isn't working and there are no significant signs of improvement after months or even years of releases and an expanded release base. No doubt, given enough time, everything fails now and then, but with Tesla so often it never gets off the ground in the first place.

Is it a poor binocular camera design, an inadequate data set, or some combination of both? Who knows, but for now it's unsafe, and although Tesla might be aware of the issue, it's likely very low on the priority list, if it's there at all, guaranteeing it won't work anytime soon.
I mean you're not wrong and there are some serious shortcomings.

Drove ~400 miles today. The improvement from 10.5 to 11.4.4 is breathtaking.

Gone are the days of aggressively changing lanes for a cone. A single cone used to throw the whole thing off and bring the experience to a grinding halt.

(ETA: Cones aren't intended to be a straw man here, just a specific example of marked improvement.)

Does it still f up? You betcha. Did it make those 400ish miles relaxing today? Definitely.

The iterative changes are small and we expect the world with each new release but the software is undeniably getting better.
 
The trailer isn't properly marked with red flags and so on, but here's where vision-only fails. It looks 2D, but it's dead serious 3D.

My god, her poor kids. Listen to how she rants while driving and swerving "this is so dangerous", "are you guys ok?".

Yeah. Please don't test FSD beta when there are kids in the car and you clearly don't understand what its limitations are. Those kids don't care what FSD beta is, they just want mommy to not kill them, and also to not terrify them.
 
I've noticed a huge bump in confidence when making a turn from a light or stop sign. At most signs it stops far enough up for me to see past obstructions and double-check its decision. At some signs it stops a little bit back but still proceeds with maximum confidence even though I can't see cross traffic. I assume that it wouldn't do that unless it was able to determine the path was safe; it wouldn't be programmed to just blindly take off, would it? Lol. I'm still scrutinizing its every move, and sometimes when it makes a maneuver that I can't double-check, it makes me a tad nervous.
 
My god, her poor kids. Listen to how she rants while driving and swerving "this is so dangerous", "are you guys ok?".

Yeah. Please don't test FSD beta when there are kids in the car and you clearly don't understand what its limitations are. Those kids don't care what FSD beta is, they just want mommy to not kill them, and also to not terrify them.
Well said. Hope that wasn't staged for clicks but it wouldn't surprise me.

FSD videos and related social media are a mess. Even Chuck is trying to muster a Ross defense. Very sad.

In any event it doesn't excuse FSD's poor performance in those 2D dominant scenarios. One could only imagine what would happen at night.
 
Reactions: Dan D.
Well said. Hope that wasn't staged for clicks but it wouldn't surprise me.

FSD videos and related social media are a mess. Even Chuck is trying to muster a Ross defense. Very sad.

In any event it doesn't excuse FSD's poor performance in those 2D dominant scenarios. One could only imagine what would happen at night.
Yes, I'm disappointed in Chuck Cook for trying to excuse the stop sign issue. He says some rather high and mighty things in this Twitter exchange, and tries to hand wave off the whole argument with "no one died today, recalibrate". So a very risky, repeatable and verified issue is nothing to worry about. Chuck has lost his safety bearings:

Chuck Cook
Dan .. You lost this battle. The car did great. This stop sign scenario was admittedly not perfect .. but compared to the average human.. your bar is in the wrong place.

snowytrail1
Not perfect? If he doesn't brake hard there, it T-bones that SUV at 30mph.

Chuck Cook
That is the definition of not perfect. FSDBeta driver. Disengage

Gilles Primeau
IF you have enough time to disengage/react; threshold for that is & will remain highly driver-dependent
FSD accidents & near misses (also in TACC) happen in part because Musk gets Tesla to treat this like a PC game where only performance counts to last moment at safety’s expense

GoGators
Not perfect? Come on Chuck.

Chuck Cook
Have I EVER said it was perfect? I am the best person to talk to about how to find a path to better. Cmon..

GoGators
No. And I didn’t say that you said that. You said that it was “…admittedly not perfect..”. I can tell you’re quite humble. Perhaps you could admit that this very specific example was dangerous. I believe FSD has a bright future but this very well could have been a bad accident.

SwingTraderCO
Not perfect = it's ok for innocent people to die

Chuck Cook
No one died today. Recalibrate.
 
Reactions: kabin and jebinc
No one died today. Recalibrate.

That's a perfectly acceptable response, given the context. It starts from O'Dowd proclaiming:

"FSD tried to kill us in an hour of city driving! Everyone should demand it be banned immediately."

FSD Beta made a mistake. It didn't try to kill anyone any more than any other piece of technology tries to kill you on a daily basis.