FSD seems to have a problem with wider shoulders. On a couple of streets, it tries to drive on the shoulder assuming it’s a lane - sometimes even straddling the “lanes”.

Since it’s not something humans would do, how did it learn that behavior?

Can confirm. I live in a city with very wide bike lanes and v12 has attempted to use them as driving lanes on multiple occasions, particularly when making a right turn onto a street with a wide bike lane.
 
Based on Tesla's message ("Full Self-Driving (Supervised) is enabled on your vehicle"), I think that the update enabled this without my knowledge. So be aware of this if you have multiple profiles.

Does Tesla enable this without you having to confirm the legal liability disclaimer?
Yes. This unapproved FSD enabling has been reported here by a couple of others, including me. In those profiles, if you disable and re-enable FSD, it does require the confirmation.

Rather than FSDS I guess it is SEFSD, Self-Enabled Full Self Driving. Clearly a bug. I expect the Tesla legal eagles threw a fit, as this might arguably shift liability for an FSD accident to Tesla, oops. I wonder if 12.3.4 included a fix. Anyone?

The one profile on our MY which did not get SEFSD was Easy Entry, which makes sense, because that one is switched to when in park, so FSD would be superfluous. However, this also means that whoever coded SEFSD was at least paying attention to which profiles got FSD enabled, but forgot to have it check the already-enabled flag.
 
FSD seems to have a problem with wider shoulders. On a couple of streets, it tries to drive on the shoulder assuming it’s a lane - sometimes even straddling the “lanes”.

Since it’s not something humans would do, how did it learn that behavior?
Every situation the car encounters (or that a human encounters) is at least subtly different from any situation ever encountered by any car or driver before. The neural net must extrapolate from its training set what to do in this new situation. With the removal of the intermediate C++ layer, it's likely that there is no longer an explicit "lane" vs "shoulder" label anywhere in the architecture. The car usually gets it right, but sometimes gets it wrong. (I've seen this mistake a number of times myself.) Modifying the NN training set to include more lane-vs-shoulder examples should improve the reliability for this particular failure mode, but it's unclear whether this approach can scale (within the constraints of HW3 or HW4) to sufficient 9's of reliability (even for this specific failure mode, let alone all failure modes) to enable safe L3 or L4 autonomy.

So it's not that it actively learned or was trained to do a wrong or non-human thing; it's that the training was insufficient to have it mimic human driving comprehensively enough to always do the right thing. It will be interesting to see how quickly or slowly this reliability improves over the next few software updates.
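Here's a toy illustration of that training-set point (purely synthetic 2-D data and a linear classifier, nothing like Tesla's actual pipeline): the rarer the "shoulder" class is in training, the more often held-out shoulder scenes are misclassified, and augmenting the rare class helps.

```python
# Toy illustration (synthetic data, nothing like Tesla's pipeline):
# the rarer the "shoulder" class in training, the more often it's
# misclassified at test time.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_scenes(n, center):
    # Overlapping Gaussian clusters standing in for "lane" vs "shoulder".
    return rng.normal(loc=center, scale=1.0, size=(n, 2))

lane_train = make_scenes(1000, (0.0, 0.0))     # abundant class
shoulder_test = make_scenes(500, (1.5, 1.5))   # held-out rare-class scenes

def shoulder_recall(n_shoulder_train):
    X = np.vstack([lane_train, make_scenes(n_shoulder_train, (1.5, 1.5))])
    y = np.array([0] * len(lane_train) + [1] * n_shoulder_train)
    clf = LogisticRegression().fit(X, y)
    return (clf.predict(shoulder_test) == 1).mean()

print(f"recall with  20 shoulder examples: {shoulder_recall(20):.2f}")
print(f"recall with 500 shoulder examples: {shoulder_recall(500):.2f}")
```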
 
March of 1s has started. 1 in 10 times it has made the weird UPL I take almost every day. 9 out of 10 times it either disengaged (half the time) or pulled out in front of a car coming from the left and would have crashed.

Even the 1 time, I don't give it a pass, there was no one coming. Does not count in my book.
 
I was driving so I assure you she was in the street when gesticulating - and in the second image below.
This is also extremely clear from the video! You can see her take a step or two forward. Watch her position relative to the post; she's taken one to two steps into the street (which you can see her taking). Even with the car changing position you can see that has to be the case - imagine where she would be if she had remained stationary, in the second image. (She'd be three feet to the right, under the red hand, or possibly completely out of view! Remember stationary things in the foreground appear to move faster to the right than things in the background when moving across a scene from right to left. It's literally impossible for her to be in this position if she had not moved.)

View attachment 1038427 View attachment 1038426
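For the skeptical, a back-of-the-envelope parallax check (pinhole camera model; all numbers invented): a static point's image shift for a laterally translating camera is roughly focal length times translation divided by depth, so a nearby pedestrian sweeps across the frame much faster than the background, exactly as described above.

```python
# Back-of-the-envelope parallax check (pinhole camera model; all numbers
# invented). For a laterally translating camera, a static point's image
# shift is ~ focal_length * translation / depth, so nearby objects sweep
# across the frame much faster than distant ones.
FOCAL_PX = 1000.0        # assumed focal length, pixels
TRANSLATION_M = 2.0      # assumed lateral camera motion between frames

for depth_m in (5.0, 20.0, 50.0):
    shift_px = FOCAL_PX * TRANSLATION_M / depth_m
    print(f"static object at {depth_m:4.0f} m shifts ~{shift_px:4.0f} px")
```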
Thanks for the clarification but that's not "striding aggressively".
 
Very weird behavior yesterday: when I drove from Cleveland to Erie it was ping-ponging on the freeway like crazy. On my way back today it's very stable within the lines, with just a little bit of ping-ponging. Any ideas why? Now that I think about it, it could've been the wind blowing the car around and the car just re-centering after the wind.
That's V11 behavior, but have you had this happen on the V12 stack?
 
I have two profiles, Al FSD and Al No FSD. After some surprising behavior, I checked and found that FSD had been switched on in the latter. No way I did that.
The recent release note saying "FSD has been enabled in your car" may have something to do with that?
 
Yes. This unapproved FSD enabling has been reported here by a couple of others, including me. In those profiles, if you disable and re-enable FSD, it does require the confirmation.

Rather than FSDS I guess it is SEFSD, Self-Enabled Full Self Driving. Clearly a bug. I expect the Tesla legal eagles threw a fit, as this might arguably shift liability for an FSD accident to Tesla, oops. I wonder if 12.3.4 included a fix. Anyone?

The one profile on our MY which did not get SEFSD was Easy Entry, which makes sense, because that one is switched to when in park, so FSD would be superfluous. However, this also means that whoever coded SEFSD was at least paying attention to which profiles got FSD enabled, but forgot to have it check the already-enabled flag.
I'm not feelin' it. If it were an accident, it wouldn't have gone that wide; there'd have been time to see the "oops" and correct it before wide release. My take is that it was purposely enabled for everyone, either as some sort of brazen display of confidence or as a strategy to have more people experience it, even the types (like my wife) who'd never enable it on purpose while it's still as persnickety as it is.

Just the vibe I get.
 
Thoughts on the junk now being $99/month?

Our purchase price of $10k could’ve bought us 101 months of the subscription. That’s 8.4 years, likely longer than we’ll keep our current cars.

#gotmusk’d
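The break-even arithmetic checks out, for what it's worth:

```python
# Break-even arithmetic from the post above (USD).
purchase_price = 10_000
monthly = 99

months = purchase_price / monthly
print(f"{months:.0f} months = {months / 12:.1f} years")  # 101 months = 8.4 years
```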
Same thoughts I had when I bought our first 65” TV for $6K. Who would have guessed technology prices go down over time? But I was an early adopter, so I'm not expecting a refund.
 
Thoughts on the junk now being $99/month?

Our purchase price of $10k could’ve bought us 101 months of the subscription. That’s 8.4 years, likely longer than we’ll keep our current cars.

#gotmusk’d

Simple solution: keep the car for longer. The longer you have the car, the more you profit!
 
Thoughts on the junk now being $99/month?

Our purchase price of $10k could’ve bought us 101 months of the subscription. That’s 8.4 years, likely longer than we’ll keep our current cars.

#gotmusk’d
Two thoughts:
1. With the concept of FSD transfer to a new Tesla (which many of us had been requesting for years), your purchase doesn't have to expire when you get rid of the car.
Tesla still needs to make the transfer a normal thing instead of a sales-boost premium, but it's slowly moving in that direction. I never understood why they wouldn't want to do that anyway, for customer-retention reasons.​
They could prorate it so that you get more credit if you transfer sooner and/or if you paid more. This would also allow them to play with the price without looking like they're completely shafting early adopters (see the sketch below).​
2. The FSD purchase presumably entitles you to FSD (Unsupervised) whenever that happens. I know that's a big joke around here, but it isn't to me. I think I'm going to need it, on this car or the next.
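Purely speculative sketch of how a prorated transfer credit could work (Tesla has announced no such formula, and every number here is invented):

```python
# Hypothetical prorated FSD transfer credit (pure speculation -- Tesla has
# announced no such formula, and every number here is invented). Credit
# decays linearly over an assumed 8-year window and scales with the
# original purchase price.
def transfer_credit(paid_usd: float, months_owned: int,
                    window_months: int = 96) -> float:
    remaining = max(window_months - months_owned, 0) / window_months
    return round(paid_usd * remaining, 2)

print(transfer_credit(10_000, months_owned=24))  # early transfer -> $7500 credit
print(transfer_credit(10_000, months_owned=84))  # late transfer  -> $1250 credit
```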
 
Also the same amount of time you spent graduating 🎓 high school.
But hey, you made it!

Same thoughts I had when I bought our first 65” TV for $6K. Who would have guessed technology prices go down over time? But I was an early adopter, so I'm not expecting a refund.
Are you seriously comparing a depreciating old TV to an evolving “full self driving” feature that has yet to be fully completed (unsupervised) as advertised?
 
Can confirm. I live in a city with very wide bike lanes and v12 has attempted to use them as driving lanes on multiple occasions, particularly when making a right turn onto a street with a wide bike lane.
I've had the same issue as well. There's one particular right turn I often take where FSD will constantly turn into the shoulder (which ends after another block) instead of the actual road lanes.

I wish they'd use some special-case code for certain locations that it always gets wrong: if there's more than a certain number of disengagements at the same spot, someone looks at it and manually corrects the metadata there (a rough sketch of the idea follows below).

Aside from the aforementioned turning onto the shoulder, and one other similar issue (turning into a parking lot just before the freeway on-ramp), FSD handles my regular commute flawlessly, so it's annoying that two small spots routinely fail and get reported to Tesla constantly.
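Roughly the sort of thing I mean, as a hypothetical sketch (made-up thresholds and coordinates; not how Tesla actually triages):

```python
# Hypothetical hotspot flag (made-up thresholds and coordinates; not how
# Tesla actually triages). Bucket disengagement GPS fixes into ~100 m grid
# cells and surface any cell that accumulates repeated events for review.
from collections import Counter

def grid_cell(lat: float, lon: float, precision: int = 3):
    # ~3 decimal places of lat/lon is roughly a 100 m cell.
    return (round(lat, precision), round(lon, precision))

def hotspots(disengagements, threshold: int = 5):
    counts = Counter(grid_cell(lat, lon) for lat, lon in disengagements)
    return {cell: n for cell, n in counts.items() if n >= threshold}

# Toy data: one spot with repeated disengagements, two one-offs elsewhere.
events = [(41.4993, -81.6944)] * 7 + [(41.5101, -81.7002), (41.4888, -81.6700)]
print(hotspots(events))  # {(41.499, -81.694): 7}
```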
 
Neural networks learn things in different ways, and it's not surprising for 12.x to have misunderstood the behavior. End-to-end has needed to learn to make a turn on green… except yield when there's a pedestrian… except continue if the pedestrian hasn't moved for a while… except if the walk signal just changed. This could also be complicated by the variance of how quickly pedestrians respond to signals, but presumably something similar happens at stop signs with pedestrians waiting for an opportunity to enter, so there could be potential for shared learning signals.

Potentially the current neural networks already have an internal perception and understanding of walk signals, but the control predictions don't give enough weight to that information. A training process that adjusts the weights to pay more attention to walk signals (boosting that signal's influence from, say, 10% to 50%) does not change the size of the network, so it doesn't affect the runtime compute and memory requirements of the system. Hopefully "just" more disengagements and training data will address this issue.
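A minimal sketch of that point (toy model, not Tesla's network): a gradient step moves the weights, but the parameter count, and with it runtime compute and memory, stays exactly the same.

```python
# Minimal sketch (toy model, not Tesla's network): a gradient step changes
# the weights but leaves the parameter count -- and thus runtime compute
# and memory -- exactly as it was.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
n_params = sum(p.numel() for p in model.parameters())

x = torch.randn(8, 16)                # stand-in inputs
target = torch.randint(0, 2, (8,))    # stand-in labels
opt = torch.optim.SGD(model.parameters(), lr=0.1)

loss = nn.functional.cross_entropy(model(x), target)
loss.backward()
opt.step()  # weights shift toward the new signal; architecture untouched

assert n_params == sum(p.numel() for p in model.parameters())
print(f"parameter count unchanged after training step: {n_params}")
```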
So weird though - why did it go to 8 mph without stopping at the line? I do often press the accelerator (otherwise I would have been much closer behind the lead car), but I didn't here, because all of a sudden (after just lazing about) it was off to the races!

As a human, you proceed to the line quickly, stop, then if the light still hasn’t changed, go, leaving plenty of margin to the pedestrian (tricky here because there are two right turn lanes). And if the light turns, you don’t go!

In all cases you exhibit a bit of caution because it is obvious there is a pedestrian waiting to cross.

I can't see any way to fix this issue unless it has instant reaction time. If it can't anticipate, it has to be able to react immediately. 0.5 seconds is not going to cut it.
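The rough numbers behind that worry, assuming the car is rolling at the 8 mph mentioned above:

```python
# Rough numbers behind the reaction-time worry (8 mph from the scenario above).
MPH_TO_MS = 0.44704
speed_ms = 8 * MPH_TO_MS      # ~3.6 m/s

for reaction_s in (0.5, 0.1):
    travel_m = speed_ms * reaction_s   # ground covered before the car even reacts
    print(f"{reaction_s:.1f} s reaction -> {travel_m:.1f} m traveled")
```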

By the looks of it on X, Chuck's left turn seems to be yet another fail. It doesn't even use the median at all.

12.3.4 is a regression.

No, this is not a regression - it's hard to regress on something that was failing before. This is the right way to do the turn! You should not use the median (who does this??? it's borderline insanity).

6/8, scoring extremely generously (if you count not changing other vehicles' behavior - a key part of the scoring system - it's more like 4/8, but I didn't pay attention to detailed scoring). About the same as before. (I think 7/9 if you count the trial run in the prior video, even though it made traffic move in that video from yesterday, so that would be a 0/1. So ~4/9 per proper scoring.)

This will be the start of the march of 9s, now that they're using the right strategy of rolling every turn.

All they have to do is increase assertiveness to give 0.6-0.9g of acceleration, fit comfortably through 4-5 second gaps, and they'll be fine. They also have to fix the creep behavior. This is the way.
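The gap arithmetic, under idealized constant acceleration from a stop (real traction and comfort limits will differ):

```python
# Gap arithmetic for the assertiveness claim above (idealized constant
# acceleration from a stop; real traction and comfort limits will differ).
G = 9.81                      # m/s^2
target = 30 * 0.44704         # 30 mph in m/s

for g_frac in (0.6, 0.9):
    a = g_frac * G
    t = target / a            # seconds to reach 30 mph
    d = 0.5 * a * t**2        # meters covered while accelerating
    print(f"{g_frac}g: {t:.1f} s to 30 mph over {d:.0f} m "
          f"(leaves {4 - t:.1f}+ s of margin in a 4 s gap)")
```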

I was once again right about the failure of 12.3.4. Unfortunately no beer scored because @Daniel in SD is scared and has no faith and is waiting for the arrival of FSD Beta.

I don't think any retraining has occurred here, they've just altered the caution thresholds (which are somehow inputs to the NN). I've noticed in my left turn today that 12.3.4 was MUCH more conservative on traffic from the left. I'd have to check the video but it looked like it waited on traffic 6-7 seconds away. I don't think I've seen it quite that conservative before. However, other than that, it drove the turn in an identical fashion. I haven't seen it stop in the road recently but I'm sure it can. It seems this happens when it's insufficiently aggressive and starts waiting on traffic from the right after committing. That's why they need 0.8g of acceleration. If you don't use it you lose it.

v12.4.x will also fail, at least initially. (I think maybe I'll finally be wrong! It depends on whether they incorporate new training. I actually think that end-to-end is surprisingly good, better than I thought, and maybe there is more hope for training on this sort of problem than I thought. They just need a bunch of training data from people driving the turn assertively, not waiting in the median, and consistently hitting 30mph by the apex (currently at 15mph or so). I originally thought that this problem would be way too complex for E2E, but maybe if they can simplify the trained approach there is hope.)

Good news, no regression. Bad news, still have never successfully met the unprotected left challenge.

Fix the creep! And stop at the stop line. And fix the USB issue (a key reason for 12.3.4, which also seems not to have worked).
 