Wiki MASTER THREAD: Actual FSD Beta downloads and experiences

I've only had the beta for a few days, but so far I'm surprised at how badly it drives. I knew coming into it that it was nowhere near ready for real world use, and that I would have to monitor its actions vigilantly. But in the dozen or so times that I have engaged it, I've never gone more than about 3 minutes before being forced to disengage for some safety reason. In one case it screamed at me to take control - for no obvious reason - no more than 5 seconds after it was enabled. In general, it accelerates much too quickly and waits far too long to apply the brakes. And that's in "Chill" mode.

I've read a lot of comments suggesting that it relies too much on map data (which may be invalid) rather than what it sees with its own "eyes". But there are times when the opposite is also true. On one road that I drove several times over the weekend, there is a stop sign that appears quite suddenly and close just as the car comes over the crest of a hill. Each time, it has to jam on the brakes to stop in time. I can think of at least 3 ways it could avoid that: 1) Reading the map and realizing that there was a major crossroad just ahead, 2) Taking note of the "Stop sign ahead" warning sign as it is climbing the hill toward the crest, or 3) Learning from the last time it crested the same hill. It is apparent that it does none of these.

This is just one example, but there are innumerable others.
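For what it's worth, here's a rough Python sketch of how those three cues (map, advance warning sign, route memory) could feed an earlier braking decision for the stop-sign-over-the-crest example above. This is purely illustrative; every name and threshold below is made up, and it's obviously not how Tesla's planner actually works.

```python
# Illustrative only: combine map data, advance-warning signage, and route memory
# to pick a braking point before an occluded stop sign. All names and thresholds
# are hypothetical, not Tesla's actual planner.

from dataclasses import dataclass
from typing import Optional

COMFORT_DECEL = 2.0  # m/s^2, gentle braking target


@dataclass
class StopCue:
    source: str        # "map", "warning_sign", or "memory"
    distance_m: float  # estimated distance to the stop line
    confidence: float  # 0..1


def braking_headroom(speed_mps: float, cues: list[StopCue]) -> Optional[float]:
    """How many more metres the car can travel before it must begin braking
    (0 means brake now), or None if there is no usable advance cue."""
    usable = [c for c in cues if c.confidence > 0.5]
    if not usable:
        return None  # nothing to anticipate; rely on direct sign detection
    # Trust the nearest credible estimate of the stop line.
    stop_dist = min(c.distance_m for c in usable)
    # Distance needed to stop comfortably: v^2 / (2a).
    needed = speed_mps ** 2 / (2 * COMFORT_DECEL)
    return max(stop_dist - needed, 0.0)


# Example: cresting a hill at 20 m/s with a map stop line 120 m ahead and a
# "Stop sign ahead" warning sign already seen 110 m from the stop line.
cues = [StopCue("map", 120.0, 0.9), StopCue("warning_sign", 110.0, 0.7)]
print(braking_headroom(20.0, cues))  # ~10 m of headroom, so start slowing well before the crest
```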
 
A new behavior I noticed in 10.8, which I saw someone else mention, is wanting to get into the center divider lane for an upcoming left turn a long way before the intersection. In previous builds it would do this at one intersection, and then, once it was in the divider lane, it would swerve back into the travel lane. Now at several intersections (actually all I have encountered so far, I believe) it wants to get into the divider lane way too early. But then yesterday I came to an intersection where there was a line of cars, and it waited until it got close to the intersection and couldn't get into the turn lane. That makes me think the car isn't saying "I need to move here to turn"; it seems to be treating it as a travel lane leading up to the intersection. That would explain why, if there are cars present, it won't get over early and get in behind them, but when no cars are present it will switch super early. I haven't had a chance to let this play out to see whether it will do the swerve-back-to-the-travel-lane behavior or not, as there have been cars around every time, so I've had to just disengage.
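To make the distinction concrete, here's a toy heuristic (nothing to do with Tesla's actual planner; all numbers invented) showing how "enter the turn lane as part of the route" differs from "treat it as a travel lane and only merge when forced":

```python
# Toy heuristic, not Tesla's planner: decide when to enter a center turn lane
# for an upcoming left. The point is that treating the lane as "part of the
# route" vs "just another travel lane" gives very different entry behavior.

def should_enter_turn_lane(dist_to_intersection_m: float,
                           turn_lane_available: bool,
                           queue_length_m: float) -> bool:
    """Enter once within a sensible approach window, or early enough
    to get in behind a queue of waiting cars instead of being boxed out."""
    if not turn_lane_available:
        return False
    approach_window_m = 150.0               # hypothetical comfortable approach
    must_join_by_m = queue_length_m + 30.0  # join before the queue blocks us out
    return (dist_to_intersection_m <= approach_window_m
            or dist_to_intersection_m <= must_join_by_m)


# Empty turn lane: wait until ~150 m out instead of diving in half a mile early.
print(should_enter_turn_lane(600.0, True, 0.0))    # False
print(should_enter_turn_lane(140.0, True, 0.0))    # True
# Queue of waiting cars ~200 m long: join early enough to get behind them.
print(should_enter_turn_lane(225.0, True, 200.0))  # True
```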
 
I don’t think single stack would be a priority for them anymore - they are probably looking at disengagement numbers in horror now
As in, you believe there have been so many 10.x betas because they've been trying to reduce the disengagement numbers, shifting resources away from getting 11/single-stack ready, and not so much that 11 required more training to reduce regressions?

It seems pretty clear from using 10.x that a bunch of things presented at AI Day haven't made it to FSD Beta, and Elon Musk most recently reiterated with Lex Fridman that various neural networks and their training still need to switch to "surround video." That could be another "single stack" milestone: instead of having many predictions use custom network stacks that were incrementally derived on the way to the "real world AI" architecture, all of them would share the same structure and training.

Yes, there will likely be regressions with a single stack, but it represents Tesla believing they have the correct technical design now and applying it everywhere (all road types: city, highway, parking; and all predictions). They have probably seen significant improvements in networks that are already on the new stack, which will likely also fix many types of existing disengagements. And the expanded fleet is sending back lots of video to produce auto-labeled training data to get it ready for release.
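For anyone who hasn't followed the AI Day material, "single stack" in this sense is roughly "one shared backbone, many prediction heads." Here's a deliberately tiny PyTorch sketch of that idea; the layer sizes, head names, and structure are all invented for illustration, not Tesla's actual architecture:

```python
# Conceptual sketch of "single stack": one shared surround-video backbone feeding
# many prediction heads, instead of separate custom networks per task.
# Purely illustrative; the real architecture, sizes, and head list are unknown here.

import torch
import torch.nn as nn


class SharedBackbone(nn.Module):
    def __init__(self, cameras: int = 8, feat_dim: int = 256):
        super().__init__()
        # Stand-in for per-camera feature extraction plus fusion into one shared feature.
        self.per_camera = nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1)
        self.fuse = nn.Linear(32 * cameras, feat_dim)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, cameras, 3, H, W)
        b, n, c, h, w = frames.shape
        feats = self.per_camera(frames.view(b * n, c, h, w))
        pooled = feats.mean(dim=(2, 3)).view(b, n * 32)  # crude global pooling
        return self.fuse(pooled)                         # (batch, feat_dim)


class SingleStack(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = SharedBackbone()
        # Every prediction task hangs off the same shared features.
        self.heads = nn.ModuleDict({
            "lanes": nn.Linear(256, 64),
            "objects": nn.Linear(256, 64),
            "traffic_controls": nn.Linear(256, 64),
        })

    def forward(self, frames: torch.Tensor) -> dict:
        shared = self.backbone(frames)
        return {name: head(shared) for name, head in self.heads.items()}


model = SingleStack()
out = model(torch.randn(1, 8, 3, 64, 96))
print({name: tensor.shape for name, tensor in out.items()})
```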
 
There are major technical changes announced every year. That will continue.
Haha yeah… there have been changes from manually fused cameras to birds-eye view, from piecemeal recent frames to temporal memory, and from radar sensor fusion to vision only; all of those then formalized into the 360º video with feature queue (temporal and spatial memory). The natural next "rewrite" / refactoring after Beta 11 would be to optimize to better fit computation and time-budget constraints, but that should potentially be "easier," in that the training data should already exist by then (instead of changing both the network structure and the data collection at the same time).
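In case "feature queue" sounds abstract: the AI Day description is basically a cache of per-frame features that get evicted by time and by how far the car has moved. A stripped-down sketch, with thresholds and structure that are mine, not Tesla's:

```python
# Simplified sketch of a "feature queue" (temporal + spatial memory): cache per-frame
# features and evict them once they're too old or the car has moved too far.
# Eviction thresholds and data layout are illustrative only.

from collections import deque
from dataclasses import dataclass


@dataclass
class QueueEntry:
    timestamp_s: float
    ego_position_m: float  # 1-D odometer stand-in for ego displacement
    features: list         # whatever the perception backbone produced


class FeatureQueue:
    def __init__(self, max_age_s: float = 2.0, max_travel_m: float = 20.0):
        self.max_age_s = max_age_s
        self.max_travel_m = max_travel_m
        self.entries: deque[QueueEntry] = deque()

    def push(self, entry: QueueEntry) -> None:
        self.entries.append(entry)
        self._evict(entry.timestamp_s, entry.ego_position_m)

    def _evict(self, now_s: float, pos_m: float) -> None:
        while self.entries and (
            now_s - self.entries[0].timestamp_s > self.max_age_s
            or pos_m - self.entries[0].ego_position_m > self.max_travel_m
        ):
            self.entries.popleft()

    def snapshot(self) -> list:
        """Features a temporal module could attend over (e.g. while occluded at a stop)."""
        return [e.features for e in self.entries]


q = FeatureQueue()
for t in range(5):
    q.push(QueueEntry(timestamp_s=t * 0.5, ego_position_m=t * 6.0, features=[t]))
print(q.snapshot())  # oldest entry dropped once it is >2 s old or >20 m behind
```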
 
So this was silly. I was driving north on Oregon 99 in Ashland. There are two northbound lanes; we were in the left. FSD announced, "changing lanes to follow route." There was no need for this, as both lanes continued north. It's not as though our lane was shortly going to become a left-turn-only lane. I let it make the lane change. A few seconds after settling into the right lane, it again said, "changing lanes to follow route." And so we scooted back to the left lane, where--you guessed it--it announced, "changing lanes to follow route." Which it did. There was no point to any of these three lane changes. Yes, I sent a video clip.
 
So this was silly. I was driving north on Oregon 99 in Ashland. There are two northbound lanes; we were in the left. FSD announced, "changing lanes to follow route." … There was no point to any of these three lane changes.
Yeah, I've seen similar nonsense at times.
 
Haha yeah… there have been changes from manually fused cameras to birds-eye view, from piecemeal recent frames to temporal memory, and from radar sensor fusion to vision only; all of those then formalized into the 360º video with feature queue (temporal and spatial memory). …
Using photon counts instead of processed images is a big deal too. They will need to throw away much of their training data, since it is based on processed images, and move to unprocessed images/video. Or can they de-process the images? I also wonder how they are going to manage the transition to higher-res images when HW4 arrives in a year. Just upscale the training data?
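"Just upscale the training data" would look something like the sketch below (using Pillow, with made-up resolutions for the HW4 target). It also shows why it's a questionable fix: resizing adds pixels but no new detail, and it certainly can't turn ISP-processed images back into raw photon counts.

```python
# Naive illustration of "just upscale the training data": resize a lower-res
# training image to a hypothetical higher HW4 resolution and scale its labels
# to match. Whether this is remotely adequate (vs. recapturing data) is the
# open question.

from PIL import Image

OLD_RES = (1280, 960)    # approximate current camera resolution
NEW_RES = (2560, 1920)   # hypothetical higher-res HW4 target


def upscale_sample(image_path: str,
                   boxes: list[tuple[float, float, float, float]]):
    """Return the upscaled image and bounding boxes scaled into the new resolution."""
    img = Image.open(image_path).resize(NEW_RES, Image.BILINEAR)
    sx = NEW_RES[0] / OLD_RES[0]
    sy = NEW_RES[1] / OLD_RES[1]
    scaled = [(x1 * sx, y1 * sy, x2 * sx, y2 * sy) for (x1, y1, x2, y2) in boxes]
    return img, scaled

# Note: interpolation invents no new detail, and nothing here recovers the raw
# photon-count data that was discarded when the original frames went through an ISP.
```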
 
So this was silly. I was driving north on Oregon 99 in Ashland. There are two northbound lanes; we were in the left. FSD announced, "changing lanes to follow route." … There was no point to any of these three lane changes.
Or when it moves over to avoid cones, into a lane that's just as close to a different set of cones.
 
So this was silly. I was driving north on Oregon 99 in Ashland. There are two northbound lanes; we were in the left. FSD announced, "changing lanes to follow route."
Curious, were these unnecessary changes before intersections that had an additional lane for left turns? Perhaps it got confused because the map data was inconsistent with what vision / the neural networks saw ahead. Here OSM incorrectly indicates there are 2 northbound lanes even though there are temporarily 3:
[Attachment: or 99 ashland.png — OSM map of OR 99 in Ashland]
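Purely as an illustration of the kind of map-vs-vision sanity check that could catch a mismatch like this; none of the function or variable names below correspond to real Tesla or OSM code:

```python
# Toy cross-check: when map lane data disagrees with what the cameras report,
# prefer vision and flag the tile for review instead of planning a lane change
# off stale OSM lane counts. Confidence threshold is arbitrary.

def resolve_lane_count(map_lanes: int, vision_lanes: int,
                       vision_confidence: float) -> tuple[int, bool]:
    """Return (lane count to plan with, whether to flag a map mismatch)."""
    if map_lanes == vision_lanes:
        return map_lanes, False
    if vision_confidence >= 0.8:
        return vision_lanes, True   # trust the cameras, report the stale map tile
    return map_lanes, True          # low-confidence vision: keep the map, still flag


# OSM says 2 northbound lanes, the cameras clearly see 3 near the intersection:
print(resolve_lane_count(map_lanes=2, vision_lanes=3, vision_confidence=0.9))
# -> (3, True)
```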
 
Had a strange experience that totally baffled me this morning; I only think I figured out what happened after the drive.

Set my normal destination this morning and headed out. Because of two road closures due to high-rise construction, I drove manually until I got past them and onto a side road. Went to engage and NO (mini) Gray Wheel. I crossed over into a neighborhood and still no Gray Wheel. Made a right and it appeared. Double-clicked and got the (mini) Blue Wheel. However, the car tried to blow by the first easy slow left it has NEVER missed. Disengaged, made the turn, and then NO Gray Wheel again. Drove almost a half mile and made a right turn. The wheel came back and I engaged coming up to a red light with a left turn. The car headed for the island, so I disengaged, lined up at the red light, and re-engaged. The light turned green and it would not go/turn. I noticed the display looked different and the curbs were blue. Had cars behind me, so no time to think or look; I disengaged and drove the rest of the way perplexed.

When I got back in a couple of hours later, Beta worked fine (or as crazy-ass fine as normal is 🤣). I think that for some reason Beta was not working and it was trying to use AP, since I remember seeing blue "curbs". It was raining/heavy misting, but nothing that Beta can't handle.

Not 100% positive what happened, but Beta was MIA, and if it was Beta the short times I had it on, it must have been Alpha 0.01 the way it was driving. 😱😱
 
It seems pretty clear from using 10.x that a bunch of things presented at AI Day haven't made it to FSD Beta, and Elon Musk most recently reiterated with Lex Fridman that various neural networks and their training still need to switch to "surround video." …
I think we need a separate thread on single stack. It means different things to different people.
 
Curious, were these unnecessary changes before intersections that had an additional lane for left turns?
The first one was, sorta. But it would have been pretty late had the maneuver actually been appropriate. It was changing lanes as it passed through the intersection. So if it had really been getting out of a left turn lane, that would have been driving like my dad used to.

The next two lane changes definitely had no such excuse. Good question, though.
 
That was my experience last night: Autopilot would disengage with the panic red wheel almost instantly, and there was some fleeting error at the bottom right of the screen underneath the car cartoon, something to the effect of "FSD encountered an unknown error" or similar. After re-engaging it would quit again (DI_a175 error in the notifications), and after a few attempts I'd get the "Autosteer temporarily unavailable" error too (APP_w207); at that point the gray steering-wheel icon would completely disappear from the screen. TACC would be non-functional too and would quit after engaging.

After researching some on TMC, I parked and removed the USB flash drive, and then FSD and NoA would work, but only for 10-15 minutes before panicking again. At home I left the USB unplugged and let the car sleep overnight while charging. That seems to have cured the issue for now. See this previous incident too.


The Tesla-provided solution seems similar, and so far so good. I do not normally have Sentry enabled on a daily basis, just on demand. My 1.25-hour morning commute was perfect: fully FSD/NoA with no disengagements at all. I am still keeping the USB drive unplugged and I'll update after the night commute, as that was when the problems occurred: at night. The USB drive was an 18-month-old 64GB Samsung. I did check it and no corruption was detected on a deep scan... I am replacing it anyway with a new one... $12.50 to be safe sounds reasonable.

2020 Model 3 LRAWD running 2021.44.25.6 (FSD 10.8)

I'll update later tonight.


IT WAS the USB drive; no incidents whatsoever on a very long commute home tonight (thanks to the idiots who can't drive in the rain in SoCal). So if you get FSD/NoA panic alarms, try removing/replacing the dashcam USB stick.
 
The USB drive was an 18-month-old 64GB Samsung. I did check it and no corruption was detected on a deep scan... I am replacing it anyway with a new one... $12.50 to be safe sounds reasonable.
I had to replace my USB drive this summer after it began throwing lots of errors, and even reformatting wouldn't help. It seemed like it failed from being hammered with writes for so long.

I now keep a spare in the car.
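If anyone wants to sanity-check a drive before tossing it: a deep scan can come back clean while a worn drive still stalls under sustained writes, which is what TeslaCam actually demands. Here's a quick-and-dirty throughput test you can run from a laptop (the mount path is just an example):

```python
# Quick-and-dirty sustained-write test for a dashcam USB drive, run from a
# laptop with the drive mounted. The mount path below is only an example;
# adjust it for your system.

import os
import time

MOUNT_POINT = "/Volumes/TESLACAM"        # example path, not universal
TEST_FILE = os.path.join(MOUNT_POINT, "write_test.bin")
CHUNK = b"\0" * (4 * 1024 * 1024)        # 4 MiB chunks
TOTAL_MB = 512

start = time.time()
with open(TEST_FILE, "wb") as f:
    for _ in range(TOTAL_MB // 4):
        f.write(CHUNK)
        f.flush()
        os.fsync(f.fileno())             # force the data to actually hit the drive
elapsed = time.time() - start
os.remove(TEST_FILE)

print(f"Sustained write: {TOTAL_MB / elapsed:.1f} MB/s over {TOTAL_MB} MB")
# TeslaCam records several camera streams at once; if this number is in the
# low single digits, the drive is a good candidate for the bin.
```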
 
Haha yeah… there have been changes from manually fused cameras to birds-eye view, from piecemeal recent frames to temporal memory, and from radar sensor fusion to vision only; all of those then formalized into the 360º video with feature queue (temporal and spatial memory). …
And let's not forget the future switch to quantum computing to simultaneously solve for all possible traffic, VRU, road, and weather scenarios. Just picture THAT compute hardware retrofit for our cars! 😉
 
Had this happen to me 1.5 years ago (and I'm not sure how many revisions ago) as well, though not as bad: mainly just two very badly curbed wheels. Amazing that after all of the updates and "improvements", we are still here.

 
This morning I had the same strange experience as I did yesterday morning: Wiki - MASTER THREAD: Actual FSD Beta downloads and experiences

As yesterday, it was a light mist to slight sprinkles. I took more notice and pics this time, and it seems that this version of Beta (10.8) is very sensitive to weather. As you can see in the first pic and the last pic (pic #4, from the repeater camera), it is not raining much, if at all. Still, I got the "Poor Weather Detected" message. Also, in pic #2 there is a Light Settings popup window. I only noticed this when looking at the pics. Must be a v11 feature, but I have no idea how to get to it. Anyone else noticed this or know about it?

It pulled up to a red light on AP (pic #3) and, without me doing anything, it switched back to Beta and turned the signal on (pic #4) while waiting at the light. It seems like very strange behavior to automatically go from AP to Beta without any warning.

Anyone else experiencing these types of anomalies in NOT bad weather?


[Attachments: IMG_0124D (1).jpeg, IMG_0127D (1).jpeg, IMG_0131D (1).jpeg, IMG_0132D (1).jpeg]
 