Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
This sums up my general FSD 12.x mood:

View attachment 1038402 View attachment 1038403
Poorly done by me: I should have had both hands on the wheel, and it probably would not have gotten quite so far if I had been 100% ready. But I really expected the car to come to a halt again at the stop line (as legally required if the light had been red when it arrived there), rather than launching into the turn before the light turned green.

Obviously my reaction to the light turning green is limited, but I was actually watching the pedestrian, NOT the light, so I missed that for a bit. Anyway, here is the earliest I could have physically reacted if I had been looking at the light (car already going 8mph):
View attachment 1038406

Also poorly done by the car. Not cool.

Edge case?

It didn't seem like an edge case to me. More like v12 training puts too much weight on the lead vehicle's actions. The current approach seems to cut corners in an effort to make v12 feel more capable than it actually is, and that's when the excrement is more likely to hit the proverbial fan.

 
I would have to take ownership for that one. FSD blew through that red. And I would expect her to step into the crosswalk as soon as it turns green.
Yeah, it was bad. It did not blow the red (it would have if the light had not changed). But it was obvious that the pedestrian was going to go, and the car's behavior was obviously completely wrong.

The video is super clear, and it shows that she stepped into the road (as was her right). I am not sure what FSD was doing; it was completely wrong, and the failure was completely predictable.

If I had been driving manually it would have been no issue: I would have been snugged up against the car in front and tried to get through that intersection before the light turned green. If that hadn't worked out (it might not have), there obviously would have been no "close call."

It didn't seem like an edge case to me.
No, this was an everyday case.
 
Had an interesting situation where my car (on 12.3.4, though this started with V12.3) stopped too early for a yellow light, braking pretty hard and causing the car behind me to also stop pretty hard; my car then crept forward as the trailing car got a little too close.
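Tangent, but the stop-or-go choice at a yellow can be framed as a quick kinematics check. This is just a toy sketch; all speeds, deceleration, reaction, and yellow-duration figures below are illustrative assumptions, not anything from Tesla's stack:

```python
# Toy "dilemma zone" check at a yellow light. All numbers are
# illustrative assumptions (comfortable decel, reaction time, yellow length).

def can_stop(v_mps: float, dist_m: float, decel_mps2: float = 3.0,
             reaction_s: float = 1.0) -> bool:
    """True if the car can stop comfortably before the stop line."""
    stopping = v_mps * reaction_s + v_mps ** 2 / (2 * decel_mps2)
    return stopping <= dist_m

def can_clear(v_mps: float, dist_m: float, yellow_s: float = 3.5) -> bool:
    """True if the car can pass the stop line before the light turns red."""
    return v_mps * yellow_s >= dist_m

v = 20.0   # ~45 mph in m/s
d = 60.0   # metres to the stop line when the light turns yellow
print("can stop:", can_stop(v, d), "| can clear:", can_clear(v, d))
```

In these terms, a car braking "pretty hard" for a yellow it could have cleared is picking `can_stop` when `can_clear` was the gentler option.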

12.3.4 = no real noticeable difference from 12.3.3. One time it accelerated from a green light less aggressively, which felt about right.

Created a YT short:
Why that silly dramatic music?
 
it was obvious that the pedestrian was going to go… this was an everyday case
Neural networks learn things in different ways, and it's not surprising for 12.x to have misunderstood the behavior. End-to-end has needed to learn to make a turn on green… except yield when there's a pedestrian… except continue if the pedestrian hasn't moved for a while… except if the walk signal just changed. This could also be complicated by the variance of how quickly pedestrians respond to signals, but presumably something similar happens at stop signs with pedestrians waiting for an opportunity to enter, so there could be potential for shared learning signals.
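That cascade of exceptions can be made explicit. A toy sketch of the priority order described above; every condition name and threshold here is invented for illustration, not how the network actually represents anything:

```python
# Toy rule cascade an end-to-end network has to absorb implicitly for
# "turn on green with a pedestrian present". All predicates are illustrative.

def should_yield(light_green: bool, ped_present: bool,
                 ped_stationary_s: float, walk_just_changed: bool) -> bool:
    if not light_green:
        return True                  # red/yellow: hold at the line
    if not ped_present:
        return False                 # green, no pedestrian: go
    if walk_just_changed:
        return True                  # walk signal just flipped: expect entry
    if ped_stationary_s > 10.0:
        return False                 # pedestrian clearly waiting/loitering
    return True                      # default: yield to the pedestrian

# The incident described above: green light, pedestrian present,
# walk signal just changed -> should yield.
print(should_yield(True, True, ped_stationary_s=2.0, walk_just_changed=True))
```

A hand-written cascade like this is brittle in its own ways; the point is just how many interacting exceptions the training data has to cover.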

Potentially the current neural networks already have an internal perception and understanding of walk signals, but the control predictions don't give that information enough weight. Retraining to pay more attention to walk signals (boosting the effective weight from, say, 10% to 50%) does not change the size of the network, so it doesn't affect the system's runtime compute or memory requirements. Hopefully "just" more disengagements and training data will address this issue.
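The values-vs-size point can be seen in a toy weighted sum: retraining shifts the weight values while the parameter count (and thus runtime compute and memory) stays fixed. The feature names and numbers here, including a scalar "walk_signal" input, are purely hypothetical:

```python
# Toy weighted sum standing in for "how much each cue influences control".
# Retraining changes the weight VALUES, not the parameter COUNT.

features_before = {"lead_car": 0.6, "traffic_light": 0.3, "walk_signal": 0.1}
features_after  = {"lead_car": 0.3, "traffic_light": 0.2, "walk_signal": 0.5}

assert len(features_before) == len(features_after)  # same size "network"

def control_score(weights, obs):
    """Higher score = stronger 'go' signal."""
    return sum(weights[k] * obs[k] for k in weights)

# Lead car creeping (go), light green (go), walk signal on (don't go).
obs = {"lead_car": 1.0, "traffic_light": 1.0, "walk_signal": -1.0}
print(control_score(features_before, obs), control_score(features_after, obs))
```

With the re-weighted values, the same observation produces a much weaker "go" signal, at zero extra runtime cost.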
 
Ran 12.3.4 on a local trip. As with 12.3.3, it misreads numbers on roadside signs as speed-limit signs.

In this case, in a 45 mph zone it dropped the speed to 20 mph for about a mile until encountering a proper 45 mph sign.
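One could imagine a simple plausibility filter against the map's prior speed catching this kind of misread. This is purely speculative, with made-up thresholds, not a description of anything FSD does:

```python
# Hypothetical plausibility filter for detected speed-limit signs.
# Thresholds are invented for illustration.

def accept_detected_limit(detected_mph: int, map_prior_mph: int,
                          max_drop: int = 15) -> int:
    """Keep the detection only if it's plausibly close to the map prior."""
    if map_prior_mph - detected_mph > max_drop:
        return map_prior_mph      # implausible drop: trust the map instead
    return detected_mph

print(accept_detected_limit(20, 45))  # → 45 (rejects the misread "20")
print(accept_detected_limit(35, 45))  # → 35 (plausible reduction, accepted)
```

Map data goes stale, so a real system would need more care than a single threshold; the point is only that this failure mode is filterable in principle.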

The general impression was that it mostly kept under the posted speed.

It made one left turn at an uncontrolled junction that seemed poorly done. I will have to try that one again.

I continue to be impressed with the improvement in navigation on narrow streets packed with parked cars.
 
Dashcam video of FSD v12/v11 making no attempt to avoid the bucket, if you're curious. It's particularly odd because the car in front of me did swerve around it; I would have expected FSD to take that into account (through implicit training on similar scenarios) when deciding which obstacles need avoidance. So we can add large plastic buckets to the list of things (along with curbs, potholes, and large puddles) that Tesla still needs to train its system to avoid!
This is a case where V12 might have helped since V12 seems to do more mimicking of the car in front behavior.
 
This sums up my general FSD 12.x mood: … Also poorly done by the car. Not cool.

Edge case?
My 2¢: NOT an edge case; see the last sentence.

FIRST, the Walk light was on, so NO MATTER what, pedestrians have the right of way. I don't think FSD "sees" Walk lights yet; it needs to be trained on them.

I believe I see where the car fails but human drivers don't. She seems to start with more of a "head" motion of intent that we as humans recognize instantly. I think the car is looking at the entire body and not perceiving this intent in time.

BUT AGAIN, as an inner-city runner: WALK LIGHTS are what matter, PERIOD, and any pedestrian OWNS that crosswalk.

In ATL, starting in 2025, right on red will be illegal in most of the city.

EDIT: Also, I have been hit by a right-on-red driver (very minor) and have had MANY, many hindrances or near misses.
 
I'm appreciating the pedestrian-near-car behaviour discussion.

TLDR: what will the NN make of waving poles carried by crazy old ladies swinging at the car while they walk across the street?

As part of our carbon footprint reduction, we moved to a "15 minute" neighbourhood where the vast majority of what I need is within a 15 (or in my case 20 minute cuz I'm old and slower) walk. Bus is also at the door. So I'm primarily a pedestrian for roughly 70% of my errands.

Cycling is also big in our neighbourhood (and commuting cyclists also come through here) and on collector roads there are either bike lanes on the road (dangerous for the cyclists) or cycle paths next to the sidewalk, separated from the road by a curb. That complicates intersections for drivers because they have to watch not only for pedestrians approaching the crossing but also bikes traveling at much higher speed than a pedestrian would be.

I have to pay attention to the bikes as well. I am unsteady on my feet due to a concussion from being knocked down by a bike as I exited a store (ironically not here, but at a big-box mall in the suburbs) and was hit from the side when I didn't see it due to a pillar and never anticipated it.

Unfortunately, our neighbourhood fills with commuting traffic because of a bridge, and the many attempts (using signage, light timing, and turn restrictions) to stop the hits and near misses have been for naught.

Adding FSDS to this is scary for me. The province the commuters come from has the highest concentration of EVs in our country, and my neighbourhood is a granola-eating one, so it also has a high concentration. With V12 going out to everyone (eventually; I'm so sorry for those stuck on 2024.8.x), this increases the risk to me on foot. I already walk with poles for balance and wave them in the air erratically as I cross the street when there are cars that may cut across the crosswalk. Drivers turning left or right with the light tend not to watch for pedestrians; those turning right on red are looking to their left before making the turn illegally without coming to a complete stop, and fail to look for pedestrians coming from their right, crossing with the light.

I count on the erratic pole movement in the air to catch their attention. If not, then the pole actually hitting their car as they blow past me will. This is not a speculative risk: I've grazed cars with the poles as they nearly hit me, and I have a friend who was struck by a vehicle as she crossed with the light and is still recovering her mental acuity long after the physical injuries mended.

As FSDS doesn't have a mic to pick up my screaming at them for attention, it will be my poles that may save me (not to mention the high-viz vest I wear on dull days or at night).

What remains to be seen is what the NN will make of waving poles since crazy old ladies swinging poles at the car while they walk across the street is definitely an edge case.
 
Think 12.3.4 was just to add some legacy vehicles and then move all of us forward as a group
2024.8 for those that will never get v12
12.4, the next big step in function, should be out soon and eliminate more edge cases
Don’t get me wrong, edge cases can be serious

Let’s go Tesla, keep this development train going
We want stellar performance by 8-8-2024
 
This is true. I have not had a single issue driving in the neighborhood streets with cars parked all over, even when sometimes there is just about enough room for only one car to pass through at an odd angle.
This is why I laugh when people say Mobileye is in the lead. They haven't even remotely deployed their FSD widely enough to see if it truly works everywhere. We hear complaints about Tesla's FSD because it is literally hitting millions of miles of unknown territory daily. This is the march of 9s, even though most people believe Tesla hasn't even started the march of 9s. FSD is already driving perfectly on hundreds of millions of miles of road.
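For what it's worth, the "march of 9s" is easy to make concrete: each added nine of per-mile reliability cuts expected interventions tenfold. The mileage figure is just for illustration:

```python
# Each extra nine of per-mile reliability is a 10x cut in interventions.
miles = 1_000_000
for nines in range(1, 6):
    success = 1 - 10 ** -nines           # 0.9, 0.99, 0.999, ...
    failures = miles * (1 - success)
    print(f"{success:.5f} per-mile -> ~{failures:,.0f} interventions "
          f"per {miles:,} miles")
```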
 
FSD seems to have a problem with wider shoulders. On a couple of streets, it tries to drive on the shoulder assuming it’s a lane - sometimes even straddling the “lanes”.

Since it’s not something humans would do, how did it learn that behavior?

They might have overfit scenarios where the car needs to drive on the shoulder to get around accidents.

On a trip of mine yesterday, a crash and police blocked a lane, and despite the fact that one other legal lane was still open, other drivers were driving on the shoulder to get around it. FSD followed the other drivers into the shoulder, and then merged back into traffic once it was beyond the crash.

So it might have been appropriate behavior in my scenario (even if technically illegal, I doubt the police would enforce the law). But 99% of the time it should not drive on the shoulder.
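The explicit rule a classical planner might encode here (and which an end-to-end network instead has to infer from examples, overfitting included) could look something like this toy sketch; the predicates are invented for illustration:

```python
# Toy gating rule for shoulder use. An end-to-end system has no such
# explicit rule and must infer it from demonstrations.

def may_use_shoulder(lane_blocked: bool, other_lane_open: bool,
                     others_using_shoulder: bool) -> bool:
    if not lane_blocked:
        return False                  # normal driving: never the shoulder
    if other_lane_open:
        return False                  # prefer a legal open lane
    return others_using_shoulder      # last resort: follow traffic around

print(may_use_shoulder(False, True, False))  # wide shoulder, no blockage -> False
print(may_use_shoulder(True, False, True))   # crash blocks every lane -> True
```

By this rule, the wide-shoulder straddling (no blockage at all) is plainly wrong, and even the crash scenario above would route through the open legal lane first.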
 
This is why when people say Mobile eye are in the lead I laugh. … Fsd is already driving perfectly on hundreds of millions of mile of road.

FSD is far removed from the march of nines. Aside from the many design issues/bugs, HW3/HW4 will never know about scene context and instead puts all its eggs into superficial scenery assumptions and hacky shortcuts. FSD doesn't have the capability to respond when those assumptions are incorrect. As an example: FSD is traveling 40 mph and a vehicle a few seconds ahead is waiting to enter the traffic flow from a parking lot. FSD assumes that vehicle will not enter its path, so it continues at the same speed, and in reality it has no chance of responding in time, with steering or brake, when that vehicle enters the flow. FSD doesn't slow, give more separation, or cover the brakes like a safe, attentive driver would.
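Some back-of-envelope arithmetic on that scenario. With assumed (not measured) numbers for the gap, reaction latency, and braking, holding speed leaves no margin, while pre-slowing and covering the brakes does:

```python
# Parking-lot pull-out scenario, all numbers illustrative assumptions.

def stopping_distance(v_mps: float, reaction_s: float,
                      decel_mps2: float = 6.0) -> float:
    """Reaction distance plus braking distance at constant deceleration."""
    return v_mps * reaction_s + v_mps ** 2 / (2 * decel_mps2)

gap = 27.0  # metres to the entering vehicle when it commits (assumed)

# (a) hold 40 mph (~17.9 m/s), ~1.2 s to recognise the hazard and brake
a = stopping_distance(17.9, 1.2)
# (b) defensive: pre-slow to 30 mph (~13.4 m/s), brakes covered (~0.5 s)
b = stopping_distance(13.4, 0.5)

for label, d in (("hold speed", a), ("defensive", b)):
    print(f"{label}: needs {d:.1f} m vs {gap:.0f} m gap ->",
          "stops" if d <= gap else "collision")
```

With these assumptions, the defensive profile stops with metres to spare and the hold-speed profile cannot, which is the "cover the brakes" argument in numbers.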

Here's another example @2:06. The lead vehicle slows, and FSD keeps the same speed and changes lanes to pass. Of course FSD responds with a lazy lane change, so the ego ends up much closer to the lead vehicle than a safe, attentive driver might want. If the lead vehicle pulled into FSD's path, FSD would be unable to avoid the accident. I would argue these are sub-nine capabilities.


Gotta be able to safely respond to the unexpected for the march of nines.
 
Very weird behavior yesterday when I drove from Cleveland to Erie: it was ping-ponging on the freeway like crazy. On my way back today it's very stable within the lines, with just a little bit of ping-ponging. Any ideas why? Now that I think about it, it could have been the wind blowing the car around and the car just recentering after the gusts.
 
I have two profiles, Al FSD and Al No FSD. After some surprising behavior, I checked and found that FSD had been switched on in the latter. No way I did that.
Based on Tesla's message (Full Self-Driving (Supervised) is enabled on your vehicle), I think that the update enabled this without my knowledge. So be aware of this if you have multiple profiles.

Does Tesla enable this without you having to confirm the legal liability disclaimer?

I turn on the No FSD profile when I don't want to drive 10% below the limit, and the FSD profile when the car thinks the limit is 30 mph in a 55 mph zone.