Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
Somehow, it has training saying it's OK to go around a stopped car, but not enough training to include all the exceptions to that training
End-to-end might "just" need more training of exceptions and then exceptions to those exceptions and so on. This reminds me of neural network chess engines learning that it's good to capture enemy pieces except when it would leave your pieces open to attack except when it's okay to lose those pieces because you can checkmate.

Hopefully the process of detecting disengagements to further train 12.x will provide enough ongoing signal to learn the correct behavior. Although this might result in some cycles of fixes and regressions: the neural network might initially do the right thing for the wrong reason, then learn not to do it in a similar but different situation, then need to relearn why the original behavior was actually correct, now for the right reason.
 
How did it know the amount of space in front of the box truck?

Anyway, it was something like 100-150 feet to the truck from the border of the keep-clear zone closer to the intersection.
Looks like it is 300 feet to the intersection (measured from the keep-clear edge to the crosswalk).

So something more than 200 feet to the box truck.

Just a completely silly and risky move. Remember that video cameras tend to compress perceived distances (a lens/perspective effect).

[Attached image: IMG_0362.png]

 
Here's a case of V12 failing to yield (or at least being more assertive than it should be) at a somewhat strange stop sign.


As a human, I'm a bit confused by the intersection layout. Do the oncoming cars have a stop sign (past the road coming from the left)? I think they do, meaning FSD should have yielded because the other car got to the stop sign first. Does CA not use 4-Way Stop signs?
 
There was NO reason for it to stop. The lead car was moving and never stopped. Even had it stopped (which it didn't), there was still room for another car before the "box" (as we call it in ATL).
At the moment ego reached the intersection, the car ahead had not yet cleared the keep-clear area, and was moving very slowly. Pausing maybe made sense. Once ego had paused, the approaching car could have pulled into the path, and in fact did so later on, so proceeding would have been risky. Had the lead car stopped, ego's decision would have proven correct.

I think I would have instead slowed but kept creeping for a moment, letting the situation ahead resolve one way or the other and making my stop-or-go an obvious decision. I'd also have tried to see well ahead to guess whether the lead car was likely to proceed. It is hard to see the light ahead in the video, but it does look like there is a white car stalled in the lane, which the red lead car changed lanes to the left to avoid.
 
End-to-end might "just" need more training of exceptions and then exceptions to those exceptions and so on. This reminds me of neural network chess engines learning that it's good to capture enemy pieces except when it would leave your pieces open to attack except when it's okay to lose those pieces because you can checkmate.

Hopefully the process of detecting disengagements to further train 12.x will provide enough ongoing signal to learn the correct behavior. Although this might result in some cycles of fixes and regressions: the neural network might initially do the right thing for the wrong reason, then learn not to do it in a similar but different situation, then need to relearn why the original behavior was actually correct, now for the right reason.
Exactly. Now multiply that by every possible scenario while driving. I think we vastly underestimate the complexity of our own experience that we use for driving, and we dangerously anthropomorphize our experience onto the AI driver when we see it doing some things very well. That does not mean it can do everything it needs to. It needs to be able to drive several lifetimes without ever making a mistake.

Call me when it goes a million miles without a disengagement; then we can have a real discussion about this or that shade of grey in its behavior. Until then, this whole thread is all anecdotal speculation around "hey, wouldn't it be cool if ......" over beers at the pub.
 
and was moving very slowly.
No, it was not. At least 15mph. Looks like about 20mph.

The red car subsequently traverses about 250 feet in 12 seconds, so that is an average of 14mph (including a slow roll to a stop in the turn lane)!

Emergency stopping distance from 20mph is about 14 feet, so at that location and speed, room on the other side of the “keep clear” box was guaranteed.
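The arithmetic above can be sanity-checked with a few lines of Python. This is a rough sketch: it assumes about 0.9 g of braking deceleration (a typical dry-pavement figure) and excludes driver reaction time, matching the ~14-foot claim above.

```python
# Sanity-check the speed and stopping-distance figures above.
# Assumes ~0.9 g braking deceleration (typical dry pavement) and
# ignores driver reaction time, matching the ~14-foot figure.

def avg_speed_mph(feet: float, seconds: float) -> float:
    """Average speed in mph over a distance covered in a given time."""
    return feet / seconds * 3600 / 5280

def braking_distance_ft(speed_mph: float, decel_g: float = 0.9) -> float:
    """Braking distance d = v^2 / (2a), reaction time excluded."""
    v = speed_mph * 5280 / 3600        # mph -> ft/s
    a = decel_g * 32.17                # g -> ft/s^2
    return v * v / (2 * a)

print(round(avg_speed_mph(250, 12), 1))   # ~14.2 mph average for the red car
print(round(braking_distance_ft(20), 1))  # ~14.9 ft from 20 mph
```

Both numbers line up with the post: 250 feet in 12 seconds is about 14 mph, and braking from 20 mph takes roughly 14-15 feet, leaving plenty of room past the keep-clear box.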
but it does look like there is a white car stalled in the lane, which the red lead car changed lanes to the left to avoid.
There was an oddly tall and thin truck proceeding normally to the traffic light 300 feet ahead. No stalled vehicles.
 
It no longer visualizes traffic cones or trash cans (except on the freeway where the old software stack is running)
Seems like Tesla was freeing up enough compute for 11.x and 12.x to partially run at the same time. Those visualizations were originally from FSD Visualization Preview, well before FSD Beta.

Curious, do you still get lane departure warnings? At least early during FSD Beta, the visualization would switch back to basic Autopilot to highlight the departed lane line, seemingly indicating it was powered by the even older stack.

Have others seen any videos of 12.x from USS-less / High Fidelity Park Assist vehicles? Does that still activate or maybe it was temporarily removed for additional compute?
 
Do the oncoming cars have a stop sign (past the road coming from the left)? I think they do, meaning FSD should have yielded because the other car got to the stop sign first. Does CA not use 4-Way Stop signs?
Yes, the oncoming cars have a stop sign. It's an extended, offset 4-way stop with a driveway entering it. FSD did yield eventually, it seems. Stop-sign intersections in California are sometimes marked: there will be signs like "cross traffic does not stop" on occasion. But even so, you are often just supposed to know whether cross traffic has stop signs, because the helpful signs are not always used. This means you have to be very careful and examine the intersection pavement markings (often worn off) and look for the backs of stop signs to figure it out.


Here is an example of FSD Beta failing to yield (floating into the intersection at 7mph) on a flashing red. The driver incorrectly thought he had the right of way and was cut off; the other driver had made an approximate stop. It's 3/4 of the way into the clip.

 
Here is an example of FSD Beta failing to yield on a flashing red (driver incorrectly thought he had right of way and was cut off). It’s 3/4 of the way into the clip.
It’s funny that he calls out the flashing red light in the tweet, but FSD Beta doesn't actually stop at the light.
Still, this clip shows the power of neural nets to get complicated scenarios mostly right.
 
My read (aka total guess) on the v12 situation:

After the initial release, Tesla got enough disengagement info and feedback via the cars and social media that they recognized some weak/danger spots that they felt needed to be addressed before wider release. They have been adding more video of unprotected lefts and other scenarios and retraining a new network, along with fixing the autospeed/hesitancy issues.

My guess is that it takes about 2 weeks of compute time currently to retrain the network. So I think in about a week they’ll release 12.3 to employees again, slowly expanding as they verify the issues are fixed or at least improved. I estimate 12.3 will start dribbling to non-Tesla employees on March 12 at 4:37pm, plus or minus 3 minutes.

I do not expect the public will get anything less than 12.3 at this point. End-to-end makes unit tests even more important because the networks are a black box. They need to be able to run simulations through their new networks and determine that the new vehicle behaviors don’t regress. Tests were important before. But now, without explicit logic and everything being handled by a massive mathematical matrix, they’re even more important.
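Since the post is speculating about what such regression tests might look like, here is a purely hypothetical sketch. The `run_network` function and the scenario names are invented stand-ins; nothing about Tesla's actual simulation or test infrastructure is public. The idea is simply: replay recorded scenarios through the new network in simulation and assert that the high-level decision has not changed from a previously validated build.

```python
# Hypothetical sketch of scenario-based regression testing, as described
# above. `run_network` and the scenario names are invented stand-ins;
# Tesla's actual simulation/test infrastructure is not public.

def run_network(scenario: str) -> str:
    """Placeholder: replay a recorded scenario through the new end-to-end
    network in simulation and return its high-level decision."""
    canned = {
        "unprotected_left_with_gap": "go",
        "stopped_car_blocking_lane": "go_around",
        "flashing_red_at_intersection": "stop",
    }
    return canned[scenario]

# Expected decisions captured from the previous, validated build.
EXPECTED = {
    "unprotected_left_with_gap": "go",
    "stopped_car_blocking_lane": "go_around",
    "flashing_red_at_intersection": "stop",
}

def check_no_regressions() -> list:
    """Return the list of scenarios whose decision changed."""
    return [s for s, want in EXPECTED.items() if run_network(s) != want]

print(check_no_regressions())  # [] means no behavioral regressions
```

The point of the sketch is the black-box aspect: with no explicit planning logic to inspect, behavior can only be verified from the outside, scenario by scenario.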
 
After the initial release, Tesla got enough disengagement info and feedback via the cars and social media that they recognized some weak/danger spots that they felt needed to be addressed before wider release.

That's my take as well (and seeing its tendency to be a bit overly assertive, IMO they made the right decision to halt this one, as annoyed as I am that I can't try it sooner!).

How did you arrive at the 2-week timeline to retrain? Is that based on the time between previous V12 releases? Does anyone have the history of V12 releases somewhere, without having to go root around in Twitter for an hour?
 
Tesla is focusing on FSD that can drive anywhere but I've said for a long time, even Level 3 on the highway would be a boon and something a lot of people would gladly pay for.
100%. L3 on limited access highways is all I’ve ever wanted from FSD. The rest is just parlor tricks with no real utility. But, until some new leadership comes along without naive visions of robotaxis dancing in their heads I’m afraid we won’t see a refocusing on a usable system. And NoA will continue to languish in mediocrity just as it has since 2018.
 
That's my take as well (and seeing its tendency to be a bit overly assertive, IMO they made the right decision to halt this one, as annoyed as I am that I can't try it sooner!).

How did you arrive at the 2-week timeline to retrain? Is that based on the time between previous V12 releases? Does anyone have the history of V12 releases somewhere, without having to go root around in Twitter for an hour?
I can’t remember where I heard it (maybe it was a past AI Day), but Karpathy had said at the time that it took a week or two to train the perception network. This is a wild guess, but I was thinking they have more data and a larger network now (yes, it’s a different network), though they also now have more compute. So I just spitballed two weeks. I could be off by a lot; I have no idea.
 
Does anyone have the history of V12 releases somewhere, without having to go root around in Twitter for an hour?
I previously estimated 12.x(.0) release cycles at 5 weeks, but that includes the holidays. Perhaps a wider release will need to wait for 12.3 training as opposed to 12.2.2 adjustments?

November 24, 2023: 2023.38.10 / 12
December 24, 2023: 2023.44.30.10 / 12.1
January 12, 2024: 2023.44.30.11 / 12.1.1
January 21, 2024: 2023.44.30.12 / 12.1.2
February 9, 2024: 2023.44.30.15 / 12.2
February 18, 2024: 2023.44.30.20 / 12.2.1
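A quick script over those dates (years inferred from the firmware version numbers: late 2023 for the November/December builds, 2024 for the rest) shows the cadence: point releases land 9-19 days apart, while the x.y releases (12 → 12.1 → 12.2) are 30 and 47 days apart, consistent with the ~5-week estimate once the holidays are included.

```python
# Compute the gaps between the 12.x releases listed above.
# Years inferred from the firmware version numbers.
from datetime import date

releases = [
    ("12",     date(2023, 11, 24)),
    ("12.1",   date(2023, 12, 24)),
    ("12.1.1", date(2024, 1, 12)),
    ("12.1.2", date(2024, 1, 21)),
    ("12.2",   date(2024, 2, 9)),
    ("12.2.1", date(2024, 2, 18)),
]

gaps = {f"{a} -> {b}": (db - da).days
        for (a, da), (b, db) in zip(releases, releases[1:])}
for pair, days in gaps.items():
    print(f"{pair}: {days} days")
# 12 -> 12.1: 30 days, 12.1 -> 12.1.1: 19 days, and so on
```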
 
That's my take as well (and seeing its tendency to be a bit overly assertive, IMO they made the right decision to halt this one, as annoyed as I am that I can't try it sooner!).
You think you are annoyed. The YouTube stars residing outside CA are in full pout mode as their income streams dry up. Maybe we could start GoFundMe’s for each of them😆
 
You think you are annoyed. The YouTube stars residing outside CA are in full pout mode as their income streams dry up. Maybe we could start GoFundMe’s for each of them😆
Hey, don’t forget those guys are all Tesla shills according to TMC wisdom. Everything they say is biased to favour Tesla because they are Tesla’s favourites, which is why Tesla are not giving them this update despite them being vocally desperate for it.

Oh wait, that doesn’t make any sense, does it.
 
Hey, don’t forget those guys are all Tesla shills according to TMC wisdom. Everything they say is biased to favour Tesla because they are Tesla’s favourites, which is why Tesla are not giving them this update despite them being vocally desperate for it.

Oh wait, that doesn’t make any sense, does it.
Actually it does. Elon has gone full hardcore. Remember Chuck ruined it for everyone by muting Elon, not realizing that the very existence of humanity was at stake.

All of the above in your post can be true. (Note: the nonsensical random selection method strongly suggests an edict from above.)

Remember that the epitome & apotheosis of obsequiousness, Whole Mars, was still early access.
 
There was very little progress with 11.x (released March 2023) for most of last year, probably because Tesla switched priorities in April to focus on 12.x. Indeed, there are a lot of new behaviors and much improved comfort with end-to-end, which now completes whole trips for you, so it'll be interesting to see whether others experience these significant improvements on their drives as well.

Before the 11.x release, FSD Beta usage averaged around 12M mi/mo on city streets, and now with highways, usage has jumped up closer to 80M mi/mo. Usage could similarly increase significantly with a wide release of 12.x, if your experience is common across the fleet.
There has been no progress with V11; V11 is no different from the version back in 2021, and V11 will be the same 30 years from now.