TeslaFi says there are over 6,000 cars on 12.3.6 and only 600 on 12.3.4, plus smaller numbers on earlier V12 variants. There are still nearly 1,000 on the 2023.44.30.x 11.4.9 versions, and I wonder why. Perhaps they're not ready to chance version 12?

I am curious about the "free trial" folks. Did they just get 11.4.9 enabled, or did they receive (or wait for) one of the V12 variants?

I guess I'm wondering when FSD 12 will replace 11.4.9 in the production branch, and whether the V11 highway version will be replaced with V12 code.
@swedge
I've done the "first month free, into second month $99 + tax" deal,
in the midst of a number of 2,000+ mile round trips on the US east coast, from Floriduh ("hurricanes & floods & bankrupt insurance companies, OH MY")
to the Washington DC area (intense heavy traffic mixed with mild, calm traffic).

12.3.6 in chill mode

I find it mostly relaxing, except for occasional abrupt lane changes: smooth, then suddenly a 'jerk' that wakes my spouse. She is still hesitant, but less so.

(I've driven these roads for over 58 years, so I'm mostly familiar with them.)
One observation: when an exit is coming up in heavy traffic and I get in the exit lane early, the wheel does its "shudder" to alert me it wants to move to the center lane, and I have to override with "I'm staying in this lane because I'm exiting a mile ahead and don't want to force my way back into the exit lane through angry, hostile commuters."
("Why did you override?" "Because I know the roads better.")

FSD(S) 12.3.6 handled this interchange fairly well (Springfield, Virginia, USA, just south of Washington DC), with only 2 "shudders" of "I'm staying in this lane because I know the area better."

You stay _left_ if you want to go right (east, I-95/495/Woodrow Wilson Bridge), and _right_ if you want to go left (west, I-495)! Approaching from the bottom, FSD(S) handled both on 2 separate occasions.
[Attached image: diagram of the Springfield, VA interchange]


As an aside:

I'm unsure how FSD(S) would handle some extreme edge cases.

Almost all drivers I have seen with diplomatic tags are decent, but I was almost run off the road by one who just "merged," expecting me to dodge (I did), since DPL tags come with a "get out of jail free card" (this was 40 years back).
I did have a shoulder to dodge onto.

(This also happened with a gasoline tanker, so I avoided that company's tankers afterward, as IMHO I felt they were not safe. If memory serves, a few years later one of them hit a bridge abutment at the Interstate 495/270 merge overpass, October 1992.)

Another extreme edge case, years back, was near the Takoma Park, Maryland, USA Metro station: no street lights, and there had been reports of carjackings in that general area.
I stopped at a stop light and got rear-ended, lightly (also perhaps 30-40 years back; no other cars around, dark, no street lights, nothing).
(This was pre-airbag, but the tap was light enough that it wouldn't have set one off if I'd had one, yet hard enough to register as "I got hit/rear-ended.")

(The angels of my better nature were screaming at me to skedaddle, leave, etc., so I did.)

I took off to the next light, as did the other driver. There, he hopped out of his car, ran up, and said, "I hit you, aren't you going to do something?"

I hollered "YES!", floored it through the red light, and left, possibly saving my life and car and leaving him and any possible buddies standing there.
How would FSD(S) handle that edge case?
 
Maybe FSD updates will slow given today's NHTSA news? Twenty new crashes since December.

In its request for information, NHTSA said a preliminary analysis identified at least 20 crashes in Tesla vehicles equipped with the updated version of Autopilot. Of those crashes, nine involved Teslas striking other vehicles or people in its path — “frontal plane” crashes, in the agency’s parlance. These crashes seem to imply that Tesla’s camera-based vision system is inadequate at detecting some objects in front of the vehicle when Autopilot is engaged.

 
Maybe FSD updates will slow given today's NHTSA news? Twenty new crashes since December.



O Lord in Heaven, I perceive the machinations of those in power, who, under the guise of regulation, seek to impose stricter controls upon Tesla, rather than elevating all automobile makers to its standards of data collection. They shall wield this data, interpreted through the lens of their own prejudices, to impose further burdensome warnings upon Tesla vehicles. This, I fear, is but a stratagem to grant their less scrutinized allies in the industry precious time to draw level.

When thousands perish due to distraction in conventional vehicles, these regulators turn a blind eye; yet, as Proverbs tells us, "A false balance is an abomination to the Lord," for when a few meet their end in automated cars, their outcry is as thunderous as it is unjust.
 
I didn't notice any Autopark improvement between 12.3.4 and 12.3.6. You mean it shows more parking spots? Is this confirmed?
You have to be going 8 mph or less, and you'll see all the spots start appearing. One will have the P in it, but the others will be grayed boxes. If you pick one of the grayed boxes, on either the right or left side of the parking lane, it will drive to it and angle away from it before backing into it.

Whatever the HW3 chip was when they first rolled it out to 2018 FSD cars is the chip I have.
 
Yes, but you aren't getting the vision park assist.
It's a little complicated on this front. We (Atom) don't get the high-fidelity manual park assist as an option, and I haven't seen any such visualization yet. However, I've noticed that when I use the new auto-parking, it does have some novel rendering going on that looks like a grey, blobby representation of their occupancy data, which is basically a weak version of HFPA, but only while autoparking.
 
It's a little complicated on this front. We (Atom) don't get the high-fidelity manual park assist as an option, and I haven't seen any such visualization yet. However, I've noticed that when I use the new auto-parking, it does have some novel rendering going on that looks like a grey, blobby representation of their occupancy data, which is basically a weak version of HFPA, but only while autoparking.
Yes, just flat and gray instead of 3D and color. Plus we can't spin the view around to "see" other things.

[Attached image: IMG_4908.jpeg, the flat gray autopark visualization]
 
It seems to me that FSD will probably stay at the L2 level for some time. With regular improvements, I'm tempted to think we're getting closer to L3 or L4, and I think this is where a lot of irritation comes from. I think Tesla wants to have it both ways by making robotaxi claims when the system we have now requires constant supervision.

Isn't this the crux of the issue? We want it to succeed, and we want to be able to take our hands off the wheel and look around. I do. And so I've found myself daydreaming and really not focusing while the robot's driving. But this is exactly when I should be watching, expectantly waiting for those situations where the system has trouble, ready to disengage immediately.

Merging on the freeway is unnerving, especially into an exit lane, where the robot jams us in between other cars. I really wish it did better in this situation so I don't have to try to see everything at once and do it myself. Here, freeway exits are positioned right after onramps, so cars compete for the same lane. I was also surprised on the freeway when it took too long to slow down and suddenly hit the brakes. Come on! I can see traffic slowing ahead. And it keeps picking the wrong lane ahead of my turn. It's surprising it's still doing this, since I expect the computer to think faster than I can.

And if this is bad decision-making, does anyone know how curated data and model training will sort this out, given that we've moved away from heuristics and all?
 
...but these appear to be AP crashes. Even if they were on FSD, we still need to know the software version to judge the relevance to FSD(S)/V12.

Yep. The Autopilot label gets thrown around a lot, so it's hard to know. The article even mentions Autopilot as the recalled software.

Sounds like NHTSA has more pertinent concerns, specifically the system's inadequate response to in-path objects.
 
And if this is bad decision-making, does anyone know how curated data and model training will sort this out?
The stuff you're seeing is a result of the limitations of hand-built heuristics, not neural networks. Tesla uses the neural networks only while you're on the ramp and then on the secondary roads. Highway driving and merges are handled by heuristics.

When Tesla gets around to training a neural network for highway driving, you should see more natural behavior. Neural networks seem to be doing a pretty remarkable job of negotiating complex situations in downtown areas, so we can hope that it'll be equally effective on highway entrance and exit ramps, and on highways in general.
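
To make that distinction concrete, here's a minimal toy sketch. This is not Tesla's code; the function names, the thresholds, and the sigmoid stand-in for a trained network are all invented for illustration. It just shows why a hand-built rule can behave abruptly at its boundaries while a learned policy's output varies smoothly:

```python
import math

# Toy contrast between a hand-built heuristic (V11-style highway stack)
# and a learned policy (V12-style). Everything here is invented for
# illustration; a real system consumes camera frames, not two floats.

def heuristic_should_merge(gap_ahead_m: float, gap_behind_m: float) -> bool:
    """Hand-built rule with engineer-chosen thresholds.

    The decision flips abruptly as a gap crosses its threshold, which is
    one way brittle, jerky behavior can arise at the boundaries.
    """
    return gap_ahead_m > 30.0 and gap_behind_m > 20.0

def learned_merge_confidence(gap_ahead_m: float, gap_behind_m: float) -> float:
    """Stand-in for a trained network: a smooth function of its inputs,
    so the output (and hence the behavior) changes gradually."""
    score = 0.05 * (gap_ahead_m - 30.0) + 0.05 * (gap_behind_m - 20.0)
    return 1.0 / (1.0 + math.exp(-score))  # confidence in [0, 1]

if __name__ == "__main__":
    for gap in (28.0, 30.0, 32.0):
        print(f"gap_ahead={gap:>4} m  "
              f"heuristic={heuristic_should_merge(gap, 25.0)!s:<5}  "
              f"learned={learned_merge_confidence(gap, 25.0):.2f}")
```

Running it shows the heuristic snapping from False to True as the gap crosses 30 m, while the learned stand-in only drifts from 0.54 to 0.59.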
 
Maybe FSD updates will slow given today's NHTSA news? Twenty new crashes since December.



Link: "Given the total number of fatal car crashes in 2022 (42,795), the U.S. average fatal crash rate is nearly 16 deaths per 100,000 vehicles."

That would mean that the 2,000,000 recalled Teslas, if average, would have around 320 fatalities per year, or ~80 since the December recall. NHTSA cites no fatalities at all, only 20 crashes. It appears they are only counting crashes where Autopilot was active, so these numbers are not exactly comparable, but my point is that NHTSA appears to be insisting that the recall should have made FSD eliminate all crashes. They do not even compare the crash rates before and after the recall.

We have seen US airline fatalities reach zero per year for years on end. This was through the NTSB process of investigating, determining and curing the root cause of each and every crash. NHTSA, in contrast, is making perfection the enemy of improvement.
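
For anyone checking the arithmetic, here's a quick sketch. The rate and fleet size come from the quoted link; the three-month window is my own assumption, chosen because it reproduces the ~80 figure above:

```python
# Back-of-envelope check of the numbers in the post above.
US_FATAL_RATE_PER_100K = 16      # deaths per 100,000 vehicles per year (quoted link)
RECALLED_FLEET = 2_000_000       # vehicles covered by the December recall

per_year = RECALLED_FLEET / 100_000 * US_FATAL_RATE_PER_100K
print(per_year)                  # 320.0 expected fatalities/year if "average"

MONTHS_SINCE_RECALL = 3          # assumed window behind the ~80 figure
print(per_year * MONTHS_SINCE_RECALL / 12)   # 80.0
```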
 
When Tesla gets around to training a neural network for highway driving, you should see more natural behavior. Neural networks seem to be doing a pretty remarkable job of negotiating complex situations in downtown areas, so we can hope that it'll be equally effective on highway entrance and exit ramps, and on highways in general.

Any idea when Tesla will move highway driving to the new stack? Is this something we will see in v12.4, 12.5 etc or is it a v13 thing?
 
Any idea when Tesla will move highway driving to the new stack? Is this something we will see in v12.4, 12.5 etc or is it a v13 thing?
Good question. Perhaps other issues have higher priority and the V11 highway stack is good enough to be a bit lower on the list.

Speaking of merging, maybe they will add highways to v12 before they re-merge FSD with the production branch.

If we wanted yet another incorrect guess, we could ask Elon ;-)
 
Link: "Given the total number of fatal car crashes in 2022 (42,795), the U.S. average fatal crash rate is nearly 16 deaths per 100,000 vehicles."

That would mean that the 2,000,000 recalled Teslas, if average, would have around 320 fatalities per year, or ~80 since the December recall. NHTSA cites no fatalities at all, only 20 crashes. It appears they are only counting crashes where Autopilot was active, so these numbers are not exactly comparable, but my point is that NHTSA appears to be insisting that the recall should have made FSD eliminate all crashes. They do not even compare the crash rates before and after the recall.

We have seen US airline fatalities reach zero per year for years on end. This was through the NTSB process of investigating, determining and curing the root cause of each and every crash. NHTSA, in contrast, is making perfection the enemy of improvement.

I think NHTSA needs to respond when they detect a series of safety-related issues and/or owner complaints.

The oversight burden only gets worse as the software is released abroad, so it's good to nip problems in the bud now.

Update: NHTSA website:

Currently, states permit a limited number of “self-driving” vehicles to conduct testing, research, and pilot programs on public streets and NHTSA monitors their safety through its Standing General Order. NHTSA and USDOT are committed to overseeing the safe testing, development and deployment of these systems – currently in limited, restricted and designated locations

 