Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
I decided to start viewing my 10.12.2 drive videos and documenting the disengagements and interventions. I'll also document a couple of earlier releases for comparison. It's a slow process, but hopefully going forward I'll just document that day's drive, so it gets easier.

6/2/2022: 2.14 miles per disengagement (10.7 miles / 5 disengagements), 0 interventions
6/5/2022: 1.5 miles per disengagement (12 miles / 8 disengagements), 6 interventions
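For anyone tracking their own drives, the per-drive figures above are just miles driven divided by disengagement count. A tiny Python sketch (the function name is my own invention) reproducing the numbers:

```python
def miles_per_disengagement(miles: float, disengagements: int) -> float:
    """Miles driven divided by disengagement count for one drive."""
    if disengagements == 0:
        return float("inf")  # a clean drive has no disengagements to divide by
    return miles / disengagements

# Numbers from the two drives above:
print(round(miles_per_disengagement(10.7, 5), 2))  # 6/2/2022 -> 2.14
print(miles_per_disengagement(12, 8))              # 6/5/2022 -> 1.5
```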

One interesting thing I noticed was a brief error message on the screen I'd never seen before: "Traffic Control Detection Maybe Degraded." It came up a couple of times just before FSD made a mistake: once when it took the right-turn lane instead of going straight, and once when it gave up on a left turn because a vehicle had stopped well past the stop line, leaving little space to make the turn.
 
I decided to start viewing my 10.12.2 drive videos and document the disengagements and interventions.

Here are the numbers for 10.12.2 (through yesterday). About half the interventions are due to roundabouts.

[Attached image: table of 10.12.2 disengagements and interventions]
 
FSDb 10.12.2 showed evidence of recognizing a "Do Not Enter" sign. Here's the sign in question:


I'm driving straight down this road, and my car slows to a stop right before that sign. This has happened once in the past when a motorcycle had cut in front of me from a road on the left. Looks like that was a red herring, as today, there were no cars around.

Yes, the car is wrong to stop for that sign, which is actually intended for the side road. But it's interesting that it would even make this mistake, since we saw earlier in the release that Rob Maurer's car went through TWO "DNE" signs. Looking back now, it seems FSDb was giving precedence to map data (the segment where his car erred was mapped as two-way rather than one-way) over sign reading.

But it would seem that in some conditions, the car can interpret the DNE sign.
 
10.12.2 Drive.

Unmarked winding country road failure.

I've tried this road several times and FSD is always terrible. The road is narrow, and the car MUST stay very close to the right-hand side of the unmarked road around corners, typically at 15 mph. I have to disengage around any corner when a car is coming from the other direction; FSD is just too far to the left. Unfortunately, 10.12.2 is no better on this specific road than the builds I started out with last October.

It's great that FSD can go down Lombard Street in San Francisco, but clearly narrow winding roads like this one, common in New England, are harder to drive.

Reported to Tesla again.

 
10.12.2 Drive.

Unmarked winding country road failure.

I've tried this road several times and FSD is always terrible. The road is narrow, and the car MUST stay very close to the right-hand side of the unmarked road around corners, typically at 15 mph. I have to disengage around any corner when a car is coming from the other direction; FSD is just too far to the left. Unfortunately, 10.12.2 is no better on this specific road than the builds I started out with last October.

It's great that FSD can go down Lombard Street in San Francisco, but clearly narrow winding roads like this one, common in New England, are harder to drive.

Reported to Tesla again.


I used to go to Great Brook all the time. I remember North Rd. Yeah, FSD would probably just drive down the center. The unmarked roads in my neighborhood are almost 3x the width of North Rd, but the car still has issues hogging the lane. I guess we have to keep snapshotting.
 
I have around 50 miles on this version now.

It is performing pretty much the same as the previous version in my area, unfortunately.

Most of my posts lately have been pretty negative as far as how FSD Beta has been performing, so rather than add any more fuel to the fire by posting a few new things it's doing wrong, I'm just gonna leave it at this...

I look forward to the next version. :)
 
...It's great that FSD can go down Lombard Street in San Francisco, but clearly narrow winding roads like this one, common in New England, are harder to drive...
Only because Lombard is tested and, I'd guess, part of the training set.
It's similar to how FSD has problems with small roundabouts.
This illustrates the key question, which I think is difficult for us to answer based on user reports: whether FSD is developing general capabilities or fitting to (even a very large set of) captured training cases.

Let's say the Tesla data-capturing network (i.e. customer cars) expands greatly along with the training-computer capacity (Dojo or whatever). Then it will become increasingly common for users to happily report that a previously troublesome case has been solved. But, like Lombard Street vs. some random winding country road, is the observed improvement based on improved general capability, or on a trained response to that familiar test case?

Karpathy alluded to this last year when discussing neural-net vs. stored-map approaches. He mentioned that a sufficiently large training set would effectively include an NN map of the entire world.

From one perspective, that sounds great: it seems to answer the HD Maps debate and, setting aside theoretical methodology debates, seemingly gets us very close to "solved FSD". But it isn't what we're looking for if it isn't inferring generalized solutions for untrained roads, or for trained-on roads with sudden major changes due to accidents, construction, or major-event traffic-control changes.

This question is essentially the classic Turing conversation test applied to driving. If an AI handles many hours of conversational discussion on wide-ranging topics, but it's really just a bunch of canned answers and a fast lookup table, then eventually it will trip up, maybe spectacularly.

I'm not at all claiming that the Tesla engineers aren't on top of this - I see way too many smartass critics here on TMC who like to pretend they've thought of things that Tesla never did. What I'm really musing about is how FSD testers, and the interested user community, can tell whether improved behavior on a favorite test route is really a general FSD capability improvement - or just a new but overly specific training fit based on data they've helpfully supplied from past drives.
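The lookup-table-vs-generalization distinction can be sketched in a few lines of Python. Everything here is invented purely for illustration: a "memorizer" that stores canned responses to seen cases, versus a "generalizer" that learned the rule underlying them. Both look identical on a trained road; only the unseen road tells them apart.

```python
# Toy "drivers" for a curvature -> steering-angle task.
# The training cases happen to follow a simple linear rule.
train = {0.0: 0.0, 0.5: 5.0, 1.0: 10.0}  # curvature -> steering angle

def memorizer(curvature):
    # Perfect on trained cases, clueless elsewhere (canned answers + lookup).
    return train.get(curvature)  # None if this exact case was never seen

def generalizer(curvature):
    # Learned the simple rule underlying the training cases.
    return 10.0 * curvature

print(memorizer(0.5), generalizer(0.5))  # trained road: both answer 5.0
print(memorizer(0.7), generalizer(0.7))  # unseen road: None vs. 7.0
```

On a favorite test route (a trained case), the two are indistinguishable, which is exactly why user reports of "it's fixed now" can't separate them.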
 
This illustrates the key question, which I think is difficult for us to answer based on user reports: whether FSD is developing general capabilities or fitting to (even a very large set of) captured training cases.

Let's say the Tesla data-capturing network (i.e. customer cars) expands greatly along with the training-computer capacity (Dojo or whatever). Then it will become increasingly common for users to happily report that a previously troublesome case has been solved. But, like Lombard Street vs. some random winding country road, is the observed improvement based on improved general capability, or on a trained response to that familiar test case?
The assumption is that an NN trained on diverse enough cases learns things that are "general" to that set. Training-set selection is obviously a difficult task; like Elon was saying, you can overfit to Lombard.

BTW, isn’t Lombard one-way? That could explain it: that road to Great Brook is two-way. If it were one-way, FSD would handle it OK.
 
I checked all my 10.3.1 videos (the earliest I have on GoPro). Here is the comparison.

[Attached image: comparison table of 10.3.1 vs. 10.12.2 disengagements and interventions]


Definitely better, but also surprised at some of the items that haven't changed.
- In 10.3.1 I was basically disengaging at all roundabouts because of stopping or very slow & jerky behavior. Now I only have to disengage when it doesn't work properly (about 30% of the time).
- Unmarked roads were very bad, especially when a car came in the opposite direction.
- At large intersections it used to turn into the wrong lane (like the wrong side of the road).
 
I used to go to Great Brook all the time. I remember North Rd. Yeah, FSD would probably just drive down the center. The unmarked roads in my neighborhood are almost 3x the width of North Rd, but the car still has issues hogging the lane. I guess we have to keep snapshotting.
Sometimes it's even left of center. The difficulty is the tight and blind turns: FSD just has to assume a car is coming in the opposite direction and stay far right. In time, FSD will handle this winding road by doing just that.

I also bike this road, and most cars don't even bother trying to pass. Besides, they can't go any faster, so I'm not holding them up. :)
 
Sometimes it's even left of center. The difficulty is the tight and blind turns: FSD just has to assume a car is coming in the opposite direction and stay far right. In time, FSD will handle this winding road by doing just that.

I also bike this road, and most cars don't even bother trying to pass. Besides, they can't go any faster, so I'm not holding them up. :)

It’s also a problem on regular marked 40-50mph country roads. There will be a curve to the right, sometimes sharp enough to have a yellow lower speed limit sign and sometimes not, but with bushes, trees, or houses on the right impeding the view around the corner. FSD beta will drift to the left all the way over to the center line stripes. Meanwhile, oncoming traffic is also cutting to their left, e.g. toward the center line. And you can’t see them coming until they’re there. Any human driver on my side would stay toward the right of the lane through the curve, so you don’t look like you’re going head-on if there turns out to be an oncoming car, but FSD doesn’t. I have to intervene and pull right if it turns out a car is coming. I’m not sure if it wants a better view ahead or wants to leave room in case it suddenly overtakes a bicycle or what, but the appearance of heading for a head-on collision is not OK to me.

I like that it drifts right when there’s a big truck in the oncoming lane; I think it needs to do that for somewhat smaller vehicles (e.g. pickup pulling landscaping trailer, dump truck, or delivery van) and also around those right curves with either obstructed visibility or *any* oncoming traffic in view.
 
It’s also a problem on regular marked 40-50mph country roads. There will be a curve to the right, sometimes sharp enough to have a yellow lower speed limit sign and sometimes not, but with bushes, trees, or houses on the right impeding the view around the corner. FSD beta will drift to the left all the way over to the center line stripes. Meanwhile, oncoming traffic is also cutting to their left, e.g. toward the center line. And you can’t see them coming until they’re there. Any human driver on my side would stay toward the right of the lane through the curve, so you don’t look like you’re going head-on if there turns out to be an oncoming car, but FSD doesn’t. I have to intervene and pull right if it turns out a car is coming. I’m not sure if it wants a better view ahead or wants to leave room in case it suddenly overtakes a bicycle or what, but the appearance of heading for a head-on collision is not OK to me.

I like that it drifts right when there’s a big truck in the oncoming lane; I think it needs to do that for somewhat smaller vehicles (e.g. pickup pulling landscaping trailer, dump truck, or delivery van) and also around those right curves with either obstructed visibility or *any* oncoming traffic in view.
Do you find FSDb slows down for the yellow warning signs? For instance, does it drop to 35 mph in an otherwise 50 mph zone?