Huh?
She was striding aggressively into the crosswalk.
I would have to take ownership for that one. FSD blew through that red. And I would expect her to step into the crosswalk as soon as it turns green.
"I would have to take ownership for that one. FSD blew through that red. And I would expect her to step into the crosswalk as soon as it turns green."

Yeah, it was bad. It did not blow the red (it would have if the light had not changed). But it was obvious that the pedestrian was going to go, and the behavior was obviously completely wrong.
"It didn't seem like an edge case to me."

No, this was an everyday case.
"Why that silly dramatic music?"

Had an interesting situation where my car (on 12.3.4, though this started with V12.3) stopped too early for a yellow light, braking pretty hard and causing the car behind me to also stop pretty hard; my car then crept forward as the one behind got a little too close.
12.3.4 = no real noticeable difference from 12.3.3. One time it accelerated from a green light less aggressively, which I found just about right.
Created a YT short:
"...it was obvious that the pedestrian was going to go… this was an everyday case"

Neural networks learn things in different ways, and it's not surprising for 12.x to have misunderstood the behavior. End-to-end has needed to learn to make a turn on green… except yield when there's a pedestrian… except continue if the pedestrian hasn't moved for a while… except if the walk signal just changed. This could also be complicated by the variance in how quickly pedestrians respond to signals, but presumably something similar happens at stop signs with pedestrians waiting for an opportunity to enter, so there could be potential for shared learning signals.
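The layered exceptions described above can be sketched as an explicit decision procedure. This is purely illustrative: the actual end-to-end network learns something like this implicitly from video, and every function name, parameter, and threshold here is a made-up assumption, not anything from Tesla's stack.

```python
def should_proceed_with_turn(light_green: bool,
                             pedestrian_present: bool,
                             pedestrian_stationary_secs: float,
                             walk_signal_just_changed: bool) -> bool:
    """Illustrative rule stack for a turn on green with a pedestrian nearby.

    Mirrors the nested exceptions the post describes:
    make the turn on green... except yield when there's a pedestrian...
    except continue if they haven't moved for a while... except if the
    walk signal just changed.
    """
    if not light_green:
        return False                      # base rule: wait for green
    if not pedestrian_present:
        return True                       # no one to yield to
    if walk_signal_just_changed:
        return False                      # pedestrian likely about to step off
    if pedestrian_stationary_secs > 5.0:  # threshold is a made-up example
        return True                       # long-stationary pedestrian: proceed
    return False                          # default: yield


# The crosswalk incident in this thread: green light, pedestrian present,
# walk signal just changed -> the correct answer is to yield (False).
print(should_proceed_with_turn(True, True, 0.0, True))
```

Written out this way, it's four interacting conditions; learned implicitly from examples, it's easy to see how a model could get the priority of the last two exceptions wrong.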
"This is a case where V12 might have helped, since V12 seems to do more mimicking of the car in front's behavior."

Dashcam video of FSD v12/v11 making no attempt to avoid the bucket, if you're curious. It's particularly odd because the car in front of me did swerve around it; I would have expected FSD to take this into account (through implicit training on similar scenarios) when deciding which obstacles need avoidance. So we can add large plastic buckets to the list of things Tesla still needs to train its system to avoid, alongside curbs, potholes, and large puddles!
My 2¢: NOT an edge case; see the last sentence.

This sums up my general FSD 12.x mood:
[Attachment 1038402] [Attachment 1038403]
Poorly done by me: I should have had both hands on the wheel, and probably would not have let it get quite so far if I had been 100% ready. But I really expected the car to come to a halt again at the stop line (as legally required if the light had been red when it arrived there), rather than launching into the turn before the light turned green.
Obviously my reaction to the light turning green was limited, since I was actually watching the pedestrian, NOT the light, so I missed it for a bit. Anyway, here is the earliest I could have physically reacted if I had been looking at the light (the car was already going 8 mph):
[Attachment 1038406]
Also poorly done by the car. Not cool.
Edge case?
"Haha, I'm 65 and I still haven't set foot in a 'Gentlemen's Club' (or even a strip club)."

For some reason I suddenly wish I had 12.3.4.
"I continue to be impressed with the improvement in navigation on narrow streets packed with parked cars."

This is true. I have not had a single issue driving in the neighborhood streets with cars parked all over, even when sometimes there is just about enough room for only one car to pass through at an odd angle.
"This is true. I have not had a single issue driving in the neighborhood streets with cars parked all over, even when sometimes there is just about enough room for only one car to pass through at an odd angle."

This is why I laugh when people say Mobileye is in the lead. They haven't even remotely deployed their FSD widely enough to see if it truly works everywhere. We hear complaints about Tesla's FSD because it is literally hitting millions of miles of unknown territory daily. This is the march of 9s, at a point when most people believe Tesla hasn't even started the march of 9s. FSD is already driving perfectly on hundreds of millions of miles of road.
FSD seems to have a problem with wider shoulders. On a couple of streets, it tries to drive on the shoulder assuming it’s a lane - sometimes even straddling the “lanes”.
Since it’s not something humans would do, how did it learn that behavior?
"I have two profiles, Al FSD and Al No FSD. After some surprising behavior, I checked and found that FSD had been switched on in the latter. No way I did that."

Based on Tesla's message ("Full Self-Driving (Supervised) is enabled on your vehicle"), I think the update enabled this without my knowledge. So be aware of this if you have multiple profiles.