FSD Beta Videos (and questions for FSD Beta drivers)

Trying to run a red light at 5:30.
Wild swerves at 21:30.
Confirmation required for left turn at 22:30.
Nothing has more degrees of freedom than reality. You could see on the display it was having trouble with this light and it was alternating between green and red.

 
FSDBeta 9.1 - 2021.4.18.13 - First Impressions Memorial Park Drive for Comparison to Previous v9

Check it out!

Thanks for your videos! I've been enjoying watching them for a few weeks now and appreciate your commentary and the time/effort you've put into them. The repeated tests you've done give a really good benchmark to track progress.

My experience as a software engineer is that after the initial excitement of a product release, especially one that launched shiny but feature-anemic and buggy, you usually only see small, incremental fixes and improvements.

Then one day, months or years later, you're using it and you realize "huh, I haven't experienced a bug or missing feature for a whole day now", and it's at that point that the accumulation of small improvements really becomes clear.

Humans tend to get really bored really quickly from regular, small, improvements. But they do add up.

So that's how I imagine this will go. It's a very dangerous process for a car, though, since it'll be approaching the uncanny valley of autonomous driving soon, and if Tesla tries to push this out the door too quickly, it will be a disaster. Fortunately I think even Tesla understands this.

I think putting all non-AP maneuvers behind a confirmation, like you mentioned in this video, is absolutely the way to go. Then they could disable the confirmations in each scenario one by one, as the product matures.
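
Purely as a sketch of the kind of gating I mean (the maneuver names and defaults here are invented, nothing from Tesla's actual stack):

```python
# Hypothetical per-scenario confirmation gating; all names are made up.
REQUIRES_CONFIRMATION = {
    "unprotected_left": True,
    "unprotected_right": True,
    "traffic_control_proceed": True,
    "lane_change": False,  # example of a scenario already matured out of gating
}

def may_execute(maneuver: str, driver_confirmed: bool) -> bool:
    # Unknown maneuvers default to requiring confirmation (fail safe).
    return driver_confirmed or not REQUIRES_CONFIRMATION.get(maneuver, True)
```

As each scenario matures, its flag flips to False and the confirmation disappears without touching the rest of the stack.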

I would also love to say that it's time they disable traffic control confirmations, but thanks to your videos we have a very clear demonstration of why that is a bad idea.
 

That's an average of 1 disengagement per 40 minutes of driving. Personally, I don't think that's a good disengagement rate for FSD, although we're lacking a lot of info: we don't know the type of disengagement, the type of driving, the number of miles, etc.

If only Tesla would release the disengagement data, then we could maybe quantify the progress better.
Tesla told the DMV their goal is 1-2 million miles per driver interaction, so it only has to get 50,000x better. :p
Obviously 1 million miles per actual disengagement is impossible, and I don't think 1 million miles per necessary disengagement is possible either; it's also not necessary to achieve human-level safety.
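
For what it's worth, the back-of-envelope math behind that 50,000x, assuming roughly 30 mph average speed (my guess, not anything Tesla published):

```python
# Rough math only; the 30 mph average speed is an assumption.
minutes_per_disengagement = 40
avg_speed_mph = 30
miles_per_disengagement = minutes_per_disengagement / 60 * avg_speed_mph  # 20 miles

target_miles_per_interaction = 1_000_000  # low end of the 1-2M goal
print(target_miles_per_interaction / miles_per_disengagement)  # 50000.0
```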
 
Nothing has more degrees of freedom than reality. You could see on the display it was having trouble with this light and it was alternating between green and red.

As a fellow human I can figure it out pretty easily, but I've never seen a light like that outside of these videos. Are lights even standardized across the country? There has to be a finite list somewhere of every traffic light combination, right?
 
It is standardized. There are only two lights there. Green arrow and red stop light. FSD wants to go straight so it should obey the solid red. If it gets confused by multiple lights then it should default to stop or require driver intervention.
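
In pseudocode terms, something like this conservative fallback (the detection format is invented purely for illustration):

```python
# Sketch of a conservative decision when traffic-light detections conflict.
def light_decision(detections, intent="straight"):
    # Keep only lights that govern our intended path: solid heads apply to
    # everything; arrows apply only to the matching turn direction.
    relevant = [d for d in detections
                if d["shape"] == "solid" or d.get("direction") == intent]
    colors = {d["color"] for d in relevant}
    if colors == {"green"}:
        return "proceed"
    if "green" in colors and "red" in colors:
        return "request_driver"  # conflicting/flickering signals: don't guess
    return "stop"  # default to the safe choice

# Example: green left arrow plus solid red, car going straight -> "stop"
print(light_decision([
    {"shape": "arrow", "color": "green", "direction": "left"},
    {"shape": "solid", "color": "red"},
]))
```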
 
#FSDBeta 9.1 - 2021.4.18.13 - Unprotected Left Turns with Drone View - Looping behavior

I'd be interested in seeing it try that turn without any traffic at all; it seemed to have slightly different behavior when it never entered the "saw cross traffic, waiting for it to pass" state. Also at night, assuming the intersection is well lit; artificial light can provide better visibility and contrast than daylight.

Is your impression that, ignoring the fact that it makes the wrong turn, it's gotten safer? Some of your earlier videos of that turn looked terrifying: it tried to perform the maneuver without much confidence and ended up stopping in the middle of the highway.

I wonder if in the end, this scenario is solved by better feedback from the path planning back into the navigation. It could have aborted the left, done a right, then taken a u-turn or three lefts and a right (which is something I do myself sometimes on similar roads). It looks like, as of now, FSD doesn't have the capability of blocklisting maneuvers from the navigation that it discovers it can't complete (at least for the duration of the subsequent re-route). I think that would be a powerful capability.
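
Roughly what I'm picturing, with every name here hypothetical:

```python
# Hypothetical maneuver blocklist fed back into navigation; nothing here
# reflects Tesla's real interfaces.
class ManeuverBlocklist:
    def __init__(self):
        self._blocked = set()

    def block(self, intersection_id, turn):
        # Called when planning aborts a maneuver it couldn't complete.
        self._blocked.add((intersection_id, turn))

    def is_blocked(self, intersection_id, turn):
        # The router consults this while computing the re-route.
        return (intersection_id, turn) in self._blocked

    def clear(self):
        # Expire entries once the re-routed segment is behind us, so a
        # transient failure doesn't ban the turn forever.
        self._blocked.clear()
```

On an aborted turn, planning would call block(), the re-route would treat blocked maneuvers as unavailable, and clear() would run once the car is past that segment.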
 
It's tried that before, but it got into an endless loop of right-hand turns.
 
Tesla told the DMV their goal is 1-2 million miles per driver interaction, so it only has to get 50,000x better. :p
Obviously 1 million miles per actual disengagement is impossible, and I don't think 1 million miles per necessary disengagement is possible either; it's also not necessary to achieve human-level safety.

The average accident rate for humans is approx. 1 accident per 500,000 miles. Multiply that by the 100-200% improvement Elon mentioned in the recent earnings call and you get 1 per 1-2M miles, the same metric Tesla gave to the CA DMV. So if "interventions" only refers to interventions that prevented an accident, it fits perfectly with Tesla's goal of being 100-200% safer than the average human. I assume Tesla collects all disengagements from the fleet but only counts the ones that prevented an accident. When that safety disengagement rate hits 1 per 1-2M miles, Tesla will consider FSD Beta safer than the average human.
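
Putting numbers on that (how you read the percentages is ambiguous, so take the multipliers loosely):

```python
# Rough check that "100-200% safer" lands in the 1-2M mile band.
human_miles_per_accident = 500_000  # approx. police-reported average
for multiplier in (2, 3, 4):  # 2x-4x, depending how the percentages are read
    print(multiplier, human_miles_per_accident * multiplier)
# 2 1000000 / 3 1500000 / 4 2000000
```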
 
FSDBeta 9.1 - 2021.4.18.13 - First Impressions Memorial Park Drive for Comparison to Previous v9

Check it out!

Thanks for the video Chuck. It looks like the unprotected left-hand turn didn't get any changes, or at least none that show a noticeable improvement. Hopefully 9.2 brings these fixes. The drone is great for watching that left-hand traffic coming.
 
Closer to 40K miles per accident.
That number must count curb rash as a collision...
The average accident rate for humans is approx. 1 accident per 500,000 miles. If you multiply that by 100-200% as Elon mentioned in the recent Earnings Call, you get 1 per 1-2M miles, the same metric that Tesla gave to the CA DMV. So if interventions only refer to interventions that prevented an accidents then it fits perfectly with Tesla's goal of 100-200% safer than the average human. So I think interventions don't refer to all interventions but only to those that prevented an accident. I assume Tesla collects all disengagements from the fleet. Presumably, they are only counting the disengagements that prevented an accident. So when that safety disengagement rate hits 1-2M miles then Tesla will consider FSD Beta to be safer than the average human.
I think 500k is the police-reported collision rate, so it would leave out a lot of minor collisions.
Of course not every intervention to prevent a potential collision actually would have prevented one; most of the time the other driver will take evasive action.
I do like Tesla's current 1-per-2-million-miles human-driven number because it has a clean definition: active restraint deployed (which they claim correlates to any crash over 12 mph). Unfortunately it does leave out collisions with pedestrians and cyclists...
 
Right, but if the FSD system aborts the left turn and forces the navigation to re-route, and it could remove that left turn from consideration during the re-route, presumably the car wouldn't get stuck, since it would try a different path each time.
I thought of that too in previous videos but did not mention it. I think a major problem (which has been brought up before) is that Tesla's nav does not allow waypoints; this is a common complaint that has been raised endlessly. If the nav allowed waypoints, all FSD would have to do is set up a waypoint that avoids that left turn, and it should be fine.

Of course there's the other option of adding a place to avoid in the navigation, but I don't think most navigation systems allow that. However, I think behind the scenes it should be possible to shoehorn the traffic module into doing this (just mark the intersection as heavily congested and the nav should avoid it).
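
The congestion shoehorn would amount to inflating edge costs through that intersection in whatever weighted-graph router the nav uses. A generic sketch (this is textbook Dijkstra, not Tesla's nav code):

```python
import heapq

def route(graph, start, goal, avoid=frozenset(), penalty=1e6):
    """graph: {node: [(neighbor, cost), ...]}. Nodes in `avoid` get a huge
    extra cost, which mimics marking an intersection as heavily congested."""
    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nbr, cost in graph.get(node, []):
            nd = d + cost + (penalty if nbr in avoid else 0.0)
            if nd < dist.get(nbr, float("inf")):
                dist[nbr], prev[nbr] = nd, node
                heapq.heappush(heap, (nd, nbr))
    if goal not in prev and goal != start:
        return None  # unreachable
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1]
```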
 
That number must count curb rash as a collision...

I think 500k is the police-reported collision rate, so it would leave out a lot of minor collisions.
Of course not every intervention to prevent a potential collision actually would have prevented one; most of the time the other driver will take evasive action.
I do like Tesla's current 1-per-2-million-miles human-driven number because it has a clean definition: active restraint deployed (which they claim correlates to any crash over 12 mph). Unfortunately it does leave out collisions with pedestrians and cyclists...
In Europe, if the active hood in the Model S deploys (in a pedestrian or cyclist collision), that would very likely trigger the crash module too. This doesn't apply to other regions that don't have that feature, but I wonder if the sensor that deploys the hood is still active even when the car lacks the mechanism (so that it can still detect a pedestrian/cyclist collision).
 