The release notes detected on TeslaInfo from a Model S indicate that 2024.3.6 / 12.3.2.1 includes Autopark: you use the right scroll wheel to select the targeted space, then push the scroll wheel to start.

Display on the IC, right scroll wheel to select your desired parking space.
Thank you. I can’t believe there’s no option to move this and high-fidelity park assist to the main display, where it can be manipulated in a useful way.
 
Like we never got pushed to the 2024.x branches because 12.3 was on a 2023.x branch
That might be over now, with TeslaFi showing at least one vehicle upgrading from 2023.44.30.8 to 2024.3.6. Teslascope also shows a pending install of 12.3.2.1 for a vehicle on 2023.x, although those who have the video snapshot button will probably still be treated differently.
 
There is a section of highway where FSD uses V12, just before the four lanes split, one onto a local road and one back onto the highway. Today there was heavy traffic, and I think I figured out why they revert to V11 for highway.

There was this very odd behavior where the lead car would roll away and V12 would creep at like 3 mph, slowly drift into the lane line, then re-center and speed up to the lead car after 30 seconds to a minute, and repeat.

I was trying so hard to figure out WTF it was doing until I realized there was another car nearby doing the exact same thing... BECAUSE THEY WERE ON THEIR CELL PHONE.

Seems like V12 learned some unwanted highway behavior from distracted drivers.

I don't blame the model either. I'm sure there were bored Tesla drivers in stop-and-go traffic on their phones, and it got used as training data. The model doesn't know the behavior happened because they were distracted, and it straight up imitated them. Wild.
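
For what it's worth, here's a rough sketch of how a fleet-data pipeline could screen out clips like that before they reach training. Everything here is hypothetical (the clip fields, the thresholds, whether Tesla even has a cabin-attention signal to filter on); it's just the shape of the idea:

```python
# Hypothetical clip-screening pass for fleet training data. None of these
# field names or thresholds are known to be Tesla's; this only illustrates
# how distracted-driver clips could be kept out of the training set.
from dataclasses import dataclass

@dataclass
class Clip:
    clip_id: str
    mean_speed_mph: float         # average speed over the clip
    lane_center_offset_m: float   # mean absolute drift from lane center
    lead_gap_growth_m: float      # how far the lead car pulled away
    cabin_attention_score: float  # 0..1 from an assumed cabin-camera model

def looks_distracted(clip: Clip) -> bool:
    """Flag stop-and-go clips where the driver drifts and lets a big gap open."""
    creeping = clip.mean_speed_mph < 10
    drifting = clip.lane_center_offset_m > 0.5
    gap_opening = clip.lead_gap_growth_m > 30
    inattentive = clip.cabin_attention_score < 0.4
    return creeping and (drifting or gap_opening) and inattentive

clips = [
    Clip("a1", 4.0, 0.7, 45.0, 0.2),  # the behavior described above: flagged
    Clip("b2", 62.0, 0.1, 2.0, 0.9),  # ordinary highway driving: kept
]
train_set = [c for c in clips if not looks_distracted(c)]
print([c.clip_id for c in train_set])  # -> ['b2']
```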
 
It would, but only after being trained on millions of examples of good drivers in that situation. It is end-to-end, right?

An end-to-end architecture doesn't mean it's only trained on examples from real drivers. Near accidents and the final moments before actual crashes are great examples of things that are nearly impossible to collect real training data for:

- Humans don't respond very quickly, so they probably don't perform well in near-accident situations requiring fast action
- Humans can't see 360° or evaluate all evasive maneuvers
- There's a limited volume of these events

My guess is that Tesla either supplements with synthetic training data or still handles those actions with explicit C++ code instead of neural nets. Auto emergency braking is still active when FSD v12 is driving.
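
To make that guess concrete, here's a minimal sketch of the hybrid arrangement: a learned planner proposes controls, and a separate hand-written safety layer (the role AEB plays today) can override them. All names and thresholds are made up; this illustrates the architecture, not Tesla's actual code:

```python
# Sketch of a rule-based safety override wrapped around a learned planner.
# The threshold and field names are hypothetical.
from dataclasses import dataclass

@dataclass
class Controls:
    accel_mps2: float  # requested acceleration (negative = braking)
    steer_rad: float   # requested steering angle

def neural_planner(obs: dict) -> Controls:
    # Stand-in for the end-to-end network's proposed action.
    return Controls(accel_mps2=1.0, steer_rad=0.0)

def safety_override(obs: dict, proposed: Controls) -> Controls:
    """Explicit rule: if time-to-collision is critical, brake hard
    no matter what the learned planner asked for."""
    ttc = obs["gap_m"] / max(obs["closing_speed_mps"], 1e-6)
    if ttc < 1.5:  # hypothetical AEB-style trigger threshold
        return Controls(accel_mps2=-8.0, steer_rad=proposed.steer_rad)
    return proposed

obs = {"gap_m": 6.0, "closing_speed_mps": 10.0}  # closing fast: ttc = 0.6 s
final = safety_override(obs, neural_planner(obs))
print(final)  # Controls(accel_mps2=-8.0, steer_rad=0.0)
```

The appeal of a wrapper like this is that the emergency path stays deterministic and auditable even if the planner itself is a black box.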
 
Got home late today, and in the last hour my car had one heck of a download and upload.
Currently 12.3, HW4
 
Why do you let FSDb drive in the parking lot? Parking lots usually have very sharp curb angles! Have you been following the posts here that already mentioned that 12.3 has a tendency to cut corners? From my experience 12.3.1 does a little better, but I would still watch out on normal right turns and certainly not let 12.3.1 drive in the parking lot.
For the past few days I've done a ton of large parking lots like Walmart and Home Depot with straightforward lanes. No problems at all. A little slow, of course, but FSD's awareness of pedestrians is probably better than mine thanks to all the cameras, so the drives are a bit safer in that regard. No crazy sharp curb angles.

In general I now expect every drive to have zero disengagements, and I think I've had one disengagement in the last 20 drives. Like others have noted, slowness is a problem with V12.3. I decided to change my approach to see if FSD would reach its destination the way Waymo does, avoiding the need to disengage most of the time. Having used FSD for 2 1/2 years, I have a pretty good idea of what is safe.

I'm driving in metropolitan Boston but not in urban settings, which may explain why FSD is doing so well for me. Hardware 4, Model Y. Tomorrow I have a 150-mile drive, mostly on highways, and except for one mapping error I fully expect the drive to have zero disengagements and zero interventions. Pretty simple drive.
 
It still hasn't been trained on emergency vehicles. It likely just identified a vehicle that was moving fast and not going to stop at the light.
I wish I had saved the video because it felt like much more than that. The emergency vehicle was moving slowly, but it did have to go around vehicles into oncoming traffic. That was also slow because of heavy traffic; in fact, it took so long that the signal turned red.

If I had had loud music playing, I probably would have intervened and pressed the accelerator. In fact, if I had, the emergency vehicle was far enough away that it would have been fine. However, if I had moved forward, the vehicle behind me might have followed in error.

I suspect it was NOT specifically trained for this but picked up enough cues from other training sequences to determine the correct action.
 
I’m installing 12.3.2.1 (2024.3.6) on my 23MYLR as we speak. It will be my first time on v12, and I’m excited to see what the improvements are like. It’ll be interesting, as I’m currently 10 hours away from home on spring break with the family. We drove down using FSD v11 last weekend and will be driving back on FSD v12 this weekend. We’ll have lots of road time to compare!
 
Why do you let FSDb drive in the parking lot? Parking lots usually have very sharp curb angles! Have you been following the posts here that already mentioned that 12.3 has a tendency to cut corners? From my experience 12.3.1 does a little better, but I would still watch out on normal right turns and certainly not let 12.3.1 drive in the parking lot.
While it certainly cuts corners and/or crosses lane lines sometimes, I've never had it get too close to a curb. And so far it's been fine even in tight parking lots (with, as you note, tight curb angles).
 
I'm driving in metropolitan Boston but not in urban settings, which may explain why FSD is doing so well for me. Hardware 4, Model Y. Tomorrow I have a 150-mile drive, mostly on highways, and except for one mapping error I fully expect the drive to have zero disengagements and zero interventions. Pretty simple drive.
Don't forget that highway driving still uses the 11.x stack.
 
An end-to-end architecture doesn't mean it's only trained on examples from real drivers. Near accidents and the final moments before actual crashes are great examples of things that are nearly impossible to collect real training data for:

- Humans don't respond very quickly, so they probably don't perform well in near-accident situations requiring fast action
- Humans can't see 360° or evaluate all evasive maneuvers
- There's a limited volume of these events

My guess is that Tesla either supplements with synthetic training data or still handles those actions with explicit C++ code instead of neural nets. Auto emergency braking is still active when FSD v12 is driving.
Yeah, I’m not sure whether Tesla has code outside the end-to-end net, but it seems necessary for unusual situations.
 
I was told to open a service ticket under the Autopilot option and to mention the curb issue.

Tesla service is pretty limited so I feel like the advisor was just BSing. These people will get on their knees and promise that god will personally hand fix your issue if you just go away and let them continue playing candy crush. Honestly, same.
I opened a ticket under software update. Not sure where that went. Candy Crush delete 😄. They may contact me tomorrow.
When I drove again this evening, the car got too close to the curb again. I decided to leave the safe driving to my own hands.
 
*sigh* yes, you’ve said that before and others have pointed out that other laws actually reference the SAE standards.

The SAE does not write laws, but those who write the laws reference them so SAE standards do become codified. That’s actually what you want - the lawmakers let the experts do what they do best and then reference them. It’s really not hard to understand unless you want to be obtuse and excessively pedantic to avoid admitting you’re wrong.

In any case, pointing out that someone is wrong is not bullying, it’s simply telling the truth, and it’s more than a bit ironic to claim everyone else is insane when you flail in the face of rationality.
 
Commute: ~8 miles, mostly very light traffic

~12 disengagements (~9 for red traffic lights or stopped traffic, where it couldn't ease off in time; 3 for curb proximity on unmarked residential streets)

Unknown number of interventions (not really important).

v12.3

This is a supremely tough route because it is extremely fast (55-60 mph is where most disengagements occurred, though some were at 45-50 mph stops). On the other hand, visibility is often around 1/3 of a mile, so predicting light behavior is easy. Yellow lights are super long, so no issues there.
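
As a rough sanity check on that (my arithmetic, nothing from the car):

```python
# Back-of-the-envelope check: at 60 mph with ~1/3 mile of sight line,
# there is far more room than a comfortable stop needs, so late braking
# at lights is a planning choice, not physics.
MPH_TO_MPS = 0.44704
v = 60 * MPH_TO_MPS          # ~26.8 m/s
sight_line_m = 1609.34 / 3   # ~536 m of visibility
comfort_decel = 2.0          # m/s^2, a gentle, human-like braking rate

stop_dist = v**2 / (2 * comfort_decel)  # ~180 m
warning_time = sight_line_m / v         # ~20 s

print(f"stopping distance: {stop_dist:.0f} m of {sight_line_m:.0f} m visible")
print(f"time to reach the light: {warning_time:.1f} s")
```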

Reports filed for all disengagements.

Gonna be pretty freakin’ awesome once they figure out how to slow down. (Been waiting for 2.5 years.)
 