FSD 12.3.3 had difficulty exiting from the interstate today. The exit ramp came up right at an overpass, so there was a short exit window, and instead of moving forward to exit ahead of the entering traffic, it slowed to let a slower entering truck on close in front. Then it hesitated, turned off the blinker, and gave up. I think it tried again, turning the blinker back on, but it was too late: exiting traffic had moved into the exit lane and blocked us. And it wasn't a long exit. I was pretty flabbergasted. It didn't have this problem yesterday at the same exit, but I think maybe there was less merging traffic.

It did it again at the next exit: it slowed to let yet another slow truck on close in front, but at least this time there was no exiting traffic close behind, so it moved over and made the exit.

I never slow to let entering traffic in front of me; they are generally going slower, and their job is to get on the interstate behind me while I get off.

Really surprised me and I will have to keep an eye on this. I will probably accelerate approaching the exit to get past cars coming on.
The code has a problem with your exit. That is not going to change much even if you try it n times. So save yourself some trouble and turn on the signal to move the car over when you feel it's safe ahead, to help FSD a little.
 
Cross posting is discouraged here.

To keep it manageable, go find the robotaxi thread and post there.
Hi, all --


SidetrackedSue, agreed that the robotaxi thread might be a better place. Do you have a link? I'm not seeing a currently active discussion, but honestly didn't look that hard; using the search w. "Titles only" shows a seemingly relevant poll thread, maybe that would be better? At some level, though, my main topics are, "Can we use the tracker as a starting point?" and, "How rapidly might things progress?"

Mods, apologies, and I of course don't object to your cleaning things up as you see fit.

Yours,
RP
 
So, trip report.

Currently doing the long run with a MY from NJ to Dallas; this is the end of the second day, in a 2021 MY on 12.3.3.

So, as expected, most of the miles have been on various and sundry interstates. But, interestingly, have been getting to and from Superchargers in areas where I Have No Idea Where The Supercharger is located. Or to hotels, likewise.

The MY has the adaptive speed setting set. Contrary to other opinions around here, this does not cause Immediate Death, Destruction, And Explosions Of All Cars (including mine) in the vicinity. Um. The car kept up with traffic, of which there was lots.

On interstates, with the Auto Speed setting set: For whatever reason, the default speed setting appears to be 10% higher than the speed limit. So, at 70 (lots of roads out west with 70, and even 75 limits), one gets 77 mph as the default. Again.. this is keeping up with traffic.

Once again: The car isn't switching lanes into other cars or even attempting to. It does tend to hang out in the left lane a bit on two-lane thataway interstates, but switches back to the right, most of the time, when somebody comes up from the rear. Interestingly, in the middle of Tennessee somewhere, I shifted back to the right after passing some semis. And, after 30 seconds or so, it shifted back to the left with a message, "Avoiding merging traffic". At that point, the nearest exit was some 3 or 5 miles away, so, ?. So much for the, "Stay in the right lane unless passing somebody" rule.

At one point a wobbly semi driver decided to move a few inches into our lane when we were more-or-less centered on the body of the semi and there was No Where To Go, necessitating a quick intervention and some shifting left towards the guardrails; that gave the keeping-on-the-road software fits. Luckily, the semi recovered itself, but we got stuck with a "Lane Assist will not be functional until the next drive" message that stuck around for 30 miles until the next stop, at which point it went away. Kind of like a strike, but weird.

As far as rain: We left on a Wednesday. As those of you who get cheap thrills by watching the national weather know, there was a pretty major Nor'easter over the East Coast. We drove the first couple hundred miles manually because FSD was setting its speed 5 or 10 mph lower than the actual speed limit; somewhere in the Shenandoah Valley the clouds broke, though, and the set speeds came back up to normal.

In light rain: fine. In spray kicked up by semis: not so fine.

Did have some dry wipes. And then thought about it: In SW Virginia and Tennessee, we're getting these clouds of gnats. Drive through one at speed and, Splat! Instant multiple insect goo.

Cleaned it all off and the dry wipes stopped.
 
My FSD trial is going pretty well so far. I have not done the highways yet. On the back roads it does everything except unprotected lefts across 3 lanes in moderate to heavy traffic; it waits forever for all 6 lanes to be cleared.

I was amazed how it handled cars that cut you off and cross your path perpendicularly (coming from the opposite direction and turning left). Autopilot typically does a hard brake, and that too only after the car has already crossed your path. FSD, though, does exactly what a human would do: it gently slows down and then gets back to the original speed. If there is enough distance between you and the offending car, it does not slow down at all. This is a huge improvement over how Autopilot reacts to this situation.
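(For the curious, here's a toy sketch of the difference being described. This is illustrative logic of my own, not anything from Tesla's actual planner; the function names and time-gap numbers are made up.)

```python
# Toy contrast between a threshold "hard brake" reaction and a smooth,
# proportional slowdown, keyed off the crossing car's time clearance.
# All names and numbers here are illustrative assumptions.

def smooth_target_speed(current_mph: float, clearance_s: float,
                        comfortable_s: float = 3.0) -> float:
    """Ease off proportionally when the crossing car's time gap drops
    below a comfortable threshold; otherwise don't slow at all."""
    if clearance_s >= comfortable_s:
        return current_mph                              # enough room: hold speed
    return current_mph * (clearance_s / comfortable_s)  # gentle, graded slowdown

def hard_brake_target_speed(current_mph: float, clearance_s: float,
                            panic_s: float = 1.5) -> float:
    """All-or-nothing reaction, like the Autopilot behavior described."""
    return 0.0 if clearance_s < panic_s else current_mph

print(smooth_target_speed(40, 2.0))      # ~26.7 mph: gradual
print(hard_brake_target_speed(40, 1.0))  # 0.0 mph: slam the brakes
```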

Every right turn, left turn, parking-lot maneuver, and lane change is smooth. So much so that my wife only figured out the car was driving at the end of a 10 mile drive!! :D Now that is a huge victory.

Elon you legitimately stole my $200 per month
 
I got the FSD-disabled warning three times already (because I was not looking at the road but at the screen, and it didn't like that). I only have two more strikes left.

Question: What happens if I exhaust all my warnings? Will my FSD get disabled permanently and that's it? No recourse after that?
You have a good amount of time to shake the steering wheel or move the scroll wheel if it sees you not looking; you must not have noticed the screen flashing for 10 seconds?
 
Defending or debating this is admittedly not very easy from the outside. By that I mean two things, really: that we are outside of Tesla and also, for most of us including myself, outside the expertise of this rapidly developing field of applied AI technology.
True that we can't know what's happening deep inside Tesla on this front. But we do know that the overall strategy seems very driven by what Elon wants to do, and that he has been consistently and predictably overoptimistic about many aspects of FSD's progress over the years.
We really don't know how much data and what kinds of data Tesla needs at this stage. It seems like they're past the initial phase of simply needing positive examples of good driving, in various environments and traffic scenarios. If that positive reinforcement method were all they had, then "unlearning" bad behavior would be primarily accomplished by carefully curating and removing any unintended bad examples in the training data, combined with massive quantities of additional positive-reinforcement example training data. In fact, that's kind of what Elon suggested when he and Ashok published that 2023 live stream video in which the car did fail at a turn-arrow traffic light.
I agree that they do need to find ways to do negative reinforcement, and that there are limitations to purely positive reinforcement. Whether that needs to be done (or can be done) by a single end-to-end neural network is an open question, or whether some other architecture will turn out to be necessary, like an ensemble of neural nets, or a (temporary) return to hand-coding for some of the more common failure scenarios. We will see.
In that prior context, it wasn't clear what role the disengagement data would play, i.e. how these negative training examples could be used to teach what not to do. I haven't been searching this much, so I'm probably lacking in my knowledge of popular explanatory links that may be available right now - but AFAIK the current rumors are that Tesla now wants disengagement data. So something has changed in that respect, since the live stream video demonstration and commentary.
For the long tail, yes. But assuming that FSD success is possible, it simply can't be the case that more data is necessary for the common cases. Because if it is, it implies that in order to solve the long-tail failure cases that are 100x or 1000x rarer, Tesla would need fleets of literally BILLIONS of cars driving trillions of miles to gather enough data to solve them. Besides, Tesla has excellent internal mechanisms to generate synthetic data examples for training. Once they can identify a specific reproducible weak point, generating thousands of synthetic examples for it has got to be enough.
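(To put rough numbers on that scaling argument: a sketch where every figure is made up purely for illustration, not a Tesla number.)

```python
# If you need a fixed number of real-world examples per failure mode,
# the raw fleet miles required scale linearly with the event's rarity.
# All inputs below are illustrative assumptions.
examples_needed = 10_000        # training clips wanted per scenario
miles_per_event_common = 1_000  # a common scenario: one occurrence per 1k miles
rarity_multiplier = 1_000       # a long-tail case, 1000x rarer

fleet_miles_common = examples_needed * miles_per_event_common
fleet_miles_rare = fleet_miles_common * rarity_multiplier
print(f"{fleet_miles_common:,} vs {fleet_miles_rare:,} fleet miles")
# 10,000,000 vs 10,000,000,000 -- which is why synthetic examples of an
# identified weak point beat brute-force data collection for the long tail.
```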
If all they really wanted for further progress was a continuing huge source of human driving data, drawn from all over North America and beyond, well that's available already from the majority of Tesla cars that don't run FSD - and in that case the nearly-universal free trial would then be entirely counterproductive to progress! Unless of course, we come back to the theory that Elon actually thinks it's time to make big bucks from people so convinced of FSD greatness, that they will subscribe or purchase in huge numbers...
I think it's partially Elon's skewed view of how good FSD really is or isn't (as judged by a first-time experiencer), and partially desperation to try to distract away from the short-term weak financial picture. (In the medium to long term I think Tesla will be fine.) Again, if it truly is a case of not enough data to solve the common failures, then FSD is screwed, because then they can never have enough data to solve the rare cases.
No one here believes that kind of take rate and financial boon is really likely for FSD (Supervised) in today's form. Nor is the theory of an attempted EOQ stock bump at all convincing; Elon just doesn't think that way, no matter how much people like to claim it. And even for those who bear ill feelings towards Elon, I think it's a very uncompelling argument that Elon is dumber and less informed about this than all the rest of us.
Elon is brilliant, but also is hopelessly overoptimistic about timelines and rates of progress. (And IMHO, judging by his non-technical posts on X, utterly misguided about all sorts of things.) I think in his mind he lives a few years in the future, and conflates it with the present. Or at least, he conflates the 5-years-away future with the 3-months-away future. I do think FSD will be astoundingly good in 5 years, as far as initial impressions for first-time drivers. But it is not there now.
 
FSD had already turned on the turn signal as it approached the exit. Two days ago FSD handled the same exit perfectly.
Sorry, I thought you were not in the exit lane already.

But if you are talking about one of those exits that cut across an entrance merge lane, we have one of those on I-5 N at the exit to 152 W. I always disengage to take that exit manually. That type of exit is just too tight, and I never trust FSD with it.
 
The more things change, the more they stay the same? No, perhaps even worse, and in spite of all the time and money Tesla spent on that turn. Chuck thought it regressed a bit.

Yeah it's exactly what I predicted. Quote below. I'm not surprised. Not my first rodeo!

The beer is mine (4/7 or 5/7, depending on whether you count driving at 25 mph on a ~45 mph road as a failure), though I maybe will not collect it if Chuck does fewer than ten attempts (@Daniel in SD and I will have to examine the fine print of this rule in the 10.69 thread). It's a moral beer if nothing else. The best type of beer. 🍺

Do remember that 12.3 and 12.3.x do not incorporate any additional unprotected left training - that all occurred after the 12.3 release (before 12.3.3, but that isn't relevant since 12.3.3 doesn't appear to have been retrained).

NEXT time I will finally lose. I am betting against FSD 12.4 as well. Any perfect performance with 7 or more attempts is ok by me. The march of the 8.6s to 10.0s.

In any case, @Daniel in SD, let's just say I'm not worried about losing my beer bet! It left me just sitting in the middle of the traffic lanes on the left turn twice (there was no one coming). Literally just stopped, or perhaps moving at less than 1 mph.
 
I kind of feel like people have doubts sometimes but I have literally never made anything up here.
I think you perhaps have different views on what constitutes acceptable but I don’t think you make things up. See my previous post about differing experiences.
I maintain the two modes have no differences other than the target speed.
I haven't spent a lot of time with the Auto Max setting, but the little time I've spent comparing Auto Max to the normal setting tells me they are not the same.
 
The car kept up with traffic, of which there was lots.
On the freeway the old v11 behavior is in effect, just FYI. It will go to your set speed; that is what it does.
default speed setting appears to be 10% higher than the speed limit.
5%/10%/15%, depending on driving mode (Chill/Moderate/Assertive).
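(In code form, the mapping being described, using the trip report's 70 mph example. The percentages are from this thread, not official documentation, and the function is my own illustration, not Tesla's actual logic:)

```python
# Reported Auto Speed offsets by driving mode (per this thread).
# Names and structure here are my own.
OFFSETS = {"chill": 0.05, "moderate": 0.10, "assertive": 0.15}

def default_set_speed(speed_limit_mph: float, mode: str) -> float:
    """Default set speed = posted limit plus the mode's percentage offset."""
    return speed_limit_mph * (1 + OFFSETS[mode])

print(round(default_set_speed(70, "moderate"), 1))  # 77.0, matching the trip report
```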
"Lane Assist will not be functional until the next drive" message that stuck around for 30 miles until the next stop,
Normal message, old behavior. It happens if you don't signal and then drift across any of the road lines enough times. Happens all the time when trying to pass on rural highways and poking out to get visibility (you have to remember to turn on the signal even if you're just looking and not going, which is not optimal).

It's not a strike, it's just feature disablement, to prevent abuse of the feature as a poor man's Autopilot or something. The disablement is stupid as far as I am concerned.

but the little time I've spent comparing Auto Max to the normal setting tells me they are not the same.
They're not the same - the max limit is different. But let me know what differences you observe if you ever get a chance to look into it.
 
What on earth makes you think that? I've been hearing some version of "it's going to be awesome in 3-6 months" for several years now, and then it's not, and then it's all about the next big release or platform change (HW3! DOJO!!! Nothing but nets!!!)
To be clear, by "huge improvements", I don't mean perfection; I mean that I expect these sore-thumb failure cases to be improved to be roughly on parity with the other things FSD currently "does well". This is still many orders of magnitude away from L4. The reason for my optimism is that I believe the v12 end-to-end approach has far more potential for improvement than the v11 hybrid hand-coded approach, perhaps even to the point (after several years of refinement) that reliability becomes hardware-limited instead of software-limited. Still not L4, but "useful" L2, meaning that it is less mentally taxing to drive with it turned on than with it turned off, without compromising safety.
It needs to go several lifetimes with no mistakes, statistically. [...]

Average human goes 200k miles between accidents. If Tesla releases FSD at 51% of average human accidents, then it will be 1/3 as good as me. I refuse to use something that could triple my accident rate.
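(For what it's worth, the tripling claim in round numbers, taking the figures above at face value; a back-of-the-envelope sketch, not real statistics:)

```python
# Back-of-the-envelope version of the claim above. Both mileage figures
# are the round numbers from this thread, not measured data.
avg_human_miles_per_accident = 200_000
my_miles_per_accident = 600_000  # implied by "1/3 as good as me"

# If FSD ships at roughly average-human accident rates...
fsd_miles_per_accident = avg_human_miles_per_accident

rate_multiplier = my_miles_per_accident / fsd_miles_per_accident
print(f"Accident rate vs. my own driving: {rate_multiplier:.0f}x")  # 3x
```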

What the actual F--- are y'all smoking? This is going to seriously hurt or kill someone, and needs to be shut down.
For an L4 system, yes. For an L2 system, no. FSD is firmly still L2 of course, and I don't expect it to reach city-streets L3 for another 8-10 years, with HW5 or HW6. (Highway L3 is maybe 2-3 years away, on HW4.)
It needs to effectively never fail or it's useless. What am I supposed to do if I think it's making a mistake? If I can't see a reason for it pulling off the road into the ditch, it could be one of two things:

1. It's saving my life from an imminent collision I can't see.
2. It's about to crash into the ditch and kill me.

What do I do?
Until the car is L4, trust your own judgment over the car's judgment. As you said, you've driven 600k miles without getting into an "imminent collision you can't see". If the car does something obviously dangerous, then of course override it. But also, if the car's unexpected behavior isn't causing an imminent threat (e.g. if it swerves into an adjacent empty lane), then there's no need to immediately override it, and it may be saving you from something worse. So it's possible to have it both ways, and the combination of FSD + driver can be safer than either separately. Obviously you can construct hypothetical trolley-problem situations, but in practice those are vanishingly rare. The statistics, and how we think about FSD, should be informed by real-world cases.
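(A minimal sketch of that "safer in combination" point, under the strong simplifying assumption that driver and FSD failures are independent; every number below is illustrative:)

```python
# Both the driver and FSD must miss the same hazard for the combined
# system to fail. Under an independence assumption, the rates multiply.
p_driver_misses = 1e-5  # illustrative: driver fails to catch a hazard
p_fsd_misses = 1e-3     # illustrative: FSD fails to handle the same hazard

p_combined = p_driver_misses * p_fsd_misses
print(p_combined)  # 1e-08, far lower than either alone
```

The independence assumption is the whole ballgame, of course: a driver lulled into inattention by a usually-competent system breaks it.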
 
My car sometimes stays an entire car length or more behind the next car at an intersection or half a car length behind the line at a stop sign.
Yeah, ridiculous. Who does this, and why are people saying stops are normal and well executed? Do they care about their credibility? Is this truly not happening for some people (that would be interesting)? Or do they think this is how people drive (look around, it's not; people are usually within about 2-5 feet of the line, not 5-10 feet)?

Exhibit A (this is a good result, by the way; it was probably only 4-5 feet behind the line at the "NHTSA stop" (FYI: this is not an NHTSA stop)):

Rant: Why do people keep saying NHTSA stop? It's so infuriating! NHTSA literally has absolutely nothing to do with this!!!! Tesla needs to stop to 0 mph at the line. It's not a freaking big deal. What they are doing in this picture is ILLEGAL (if there's a bicycle in front of the car, you need to stop at the line). You need to stop at the line - that's what the whole NHTSA recall or whatever was about!

[Attached image: IMG_0648.jpeg]
 
Completely agreed. I can't help but feel v12 is being rushed to the public prematurely, and I speculate as to why. I think Tesla is developing a reputation problem at the moment.
I think their fundamental problem is that Elon [and thus Tesla] persists on representing FSD as "weak L4" rather than "strong L2". (He seems to be psychologically incapable of doing otherwise, to his detriment.) And I do feel like v12 likely has huge improvements in its immediate future, which makes it all the more puzzling why they are pushing it NOW, rather than a few months from now. I'll be really curious to see where it's at in a year. Still expecting nowhere near L4, but a lot more robust L2.
 
"Close to curbs" is not curbing the wheels... How far does it need to keep from curbs? Is one inch enough? 2? 4? 6? A foot?
In a statistical sense it is, though. The car's positioning is not exact; it follows roughly a bell-shaped curve, and when it hits an unlucky outlier case, it will curb the wheels. Suppose the standard deviation (expected error) in the car's positioning is about 3 inches. Aiming for 9 inches from the curb will curb the wheels about 1 in 1000 turns; aiming for 18 inches from the curb will curb the wheels just once in a billion turns. This is a simplification, but really the logical approach should be to figure out what probability of failure is acceptable, then figure out what the expected positioning error is, and set the target path based on that.
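(Here's the tail-probability arithmetic from that paragraph, spelled out; the 3-inch sigma is the post's assumption, and the helper function below is my own:)

```python
from math import erfc, sqrt

# One-sided Gaussian tail: probability that a Normal(target, sigma)
# lateral position ends up at or past the curb (offset <= 0).
def curb_probability(target_inches: float, sigma_inches: float = 3.0) -> float:
    z = target_inches / sigma_inches
    return 0.5 * erfc(z / sqrt(2))

print(curb_probability(9))   # ~1.3e-3: roughly 1 in 1000 turns
print(curb_probability(18))  # ~9.9e-10: roughly 1 in a billion turns
```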
 
In a statistical sense it is, though. The car's positioning is not exact; it follows roughly a Bell curve, and when it hits an unlucky outlier case, it will curb the wheels. Suppose the standard deviation (expected error) in the car's positioning is about 3 inches. Aiming for 9 inches from the curb will curb the wheels about 1 in 1000 turns; aiming for 18 inches from the curb will curb the wheels just once in a billion turns. This is a simplification, but really the logical approach should be to figure out what probability of failure is acceptable, then estimate what the expected positioning error is, and set the target path based on that.
It's interesting to me how robotic (in a sense) v12 is. In the absence of differences in the environment that would matter (parked cars, pedestrians, other vehicles), it seems to drive (to the ability of my eye to measure) EXACTLY the same line in my neighborhood every time. It's way more consistent than the line I take.

I should paint some lines and take video and see what is the standard deviation on the line position variation.

But yes to your point there is definitely variation.
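(If anyone actually does the painted-lines experiment, the estimation step is one-liner territory; the offsets below are made-up placeholders, not measurements:)

```python
import statistics

# Hypothetical measured lateral offsets (inches) from a painted
# reference line, one per pass. Replace with real measurements.
offsets = [11.8, 12.4, 12.1, 11.9, 12.3, 12.0]

print(statistics.mean(offsets))   # average line position
print(statistics.stdev(offsets))  # sample standard deviation (the sigma above)
```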
 
It's interesting to me how robotic (in a sense) v12 is. In the absence of differences in the environment that would matter (parked cars, pedestrians, other vehicles), it seems to drive (to the ability of my eye to measure) EXACTLY the same line in my neighborhood every time. It's way more consistent than the line I take.

I should paint some lines and take video and see what is the standard deviation on the line position variation.

But yes to your point there is definitely variation.

Does that come as a surprise to you? It's a machine learning model: if you feed the exact same input into it, you will get the exact same output.
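(A toy demonstration of that determinism, with a stand-in "model" of my own. Real deployments can wobble slightly due to non-deterministic GPU kernels, but the principle holds:)

```python
import numpy as np

# Tiny fixed-weight stand-in for a trained network: with no injected
# randomness, identical inputs produce bit-identical outputs.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 2))  # weights fixed once, like a trained model

def model(x: np.ndarray) -> np.ndarray:
    return np.tanh(x @ W)    # deterministic forward pass

x = np.array([1.0, 0.5, -0.3, 2.0])
print(np.array_equal(model(x), model(x)))  # True, every single time
```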