
FSD Beta Videos (and questions for FSD Beta drivers)


Lots of little errors and interventions. Mostly minor, but a couple were major.

17:50 - the car makes a turn it had trouble with before.

20:40 - impressive moment. The car changes lanes too slowly to make a left turn. Instead of going nuts trying to force it, it moves on and re-routes. It was handling a partial U-turn, but the driver intervened because it was heading for the middle lane instead of the right-hand lane.
 
Same here. I wish they would at least fix things like swerving into the middle of a merging lane, or handle regular curves without freaking out.
My favorite is when a 2-lane-wide exit is coming up: as the new lane opens, it swerves sharply to track the right side of the lane while it expands, before the lane lines even appear, instead of just chilling the hell out and making the lane change 2 seconds later.

Not sure if that makes sense but I'll get video if not. Easily reproduced, highly annoying.
 

Lots of little errors and interventions. Mostly minor, but a couple were major.

You missed a major safety disengagement at 5:39 to avoid the car rolling through the stop sign. That was definitely a close call. And sure, the other car would have been at fault, but IMO that is an accident a good autonomous car should be able to avoid on its own; FSD Beta did not appear to react quickly enough to the car on a collision course. Remember that autonomous cars are expected to try to avoid accidents even when the other driver would be at fault, if it is reasonable to do so without causing another accident, and this looks like exactly that kind of case. In any event, safety disengagements like this, where there is a near-accident, should be analyzed and fixed in order to improve the safety of the AV.
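
Just to put rough numbers on what "reacting quickly enough" means, here is a back-of-the-envelope time-to-collision check. To be clear, this is only my own illustration with made-up distances and speeds, not anything measured from the video and not how Tesla's software actually works:

```python
# Back-of-the-envelope time-to-collision (TTC) check.
# All numbers are assumptions for illustration, not measurements from the video.

def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if neither car changes speed."""
    return float("inf") if closing_speed_mps <= 0 else gap_m / closing_speed_mps

def can_shed_speed_in_time(speed_mps: float, ttc_s: float,
                           reaction_s: float = 0.5, decel_mps2: float = 6.0) -> bool:
    """Rough test: can the ego car react and brake to a stop before the projected impact?"""
    braking_s = speed_mps / decel_mps2
    return reaction_s + braking_s < ttc_s

# Assumed scenario: the other car is 20 m away and closing at 10 m/s (~22 mph),
# while the ego car is doing 8 m/s (~18 mph).
ttc = time_to_collision(gap_m=20.0, closing_speed_mps=10.0)    # 2.0 s
print(ttc, can_shed_speed_in_time(speed_mps=8.0, ttc_s=ttc))   # 2.0 True
```

With a reaction latency well under a second, there is time to brake in a scenario like that, which is why I think this is the kind of near-miss the Beta should handle without a human disengagement.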
 
From the most recent presentation this year, the snippets being sent to Tesla are 10 seconds long, so I presume that is the current persistence. It also appears they only recently switched to persisting previous data; I discussed this a bit here. Previously they were doing things frame by frame and only using a smoothing function to smooth things out. Now they are treating the input more like "video" and getting velocity and acceleration data from it (thus being able to replace radar).
Tesla.com - "Transitioning to Tesla Vision"

I don't know how far this persistence extends into the rest of the car's decision-making, though.
The 10-second clips are only used for training; they have been collecting 10-second clips since the very beginning, in 2017.
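
To make the distinction in the quoted post concrete, here is a toy sketch of the two approaches: single-frame estimates cleaned up with a smoothing filter, versus fitting a short clip as a whole so velocity and acceleration fall out of the sequence. The numbers and the 10 Hz frame rate are invented for illustration; this is obviously not Tesla code.

```python
import numpy as np

# Per-frame range estimates (metres) to a lead car over a short 10-frame clip (assumed 10 Hz).
frame_rate_hz = 10.0
ranges = np.array([50.0, 49.6, 48.9, 48.1, 47.2, 46.5, 45.8, 44.9, 44.2, 43.5])

# Frame-by-frame style: each frame stands alone, and a smoothing filter just cleans up
# the noise; velocity is never estimated from the imagery itself.
alpha = 0.3
smoothed = [ranges[0]]
for r in ranges[1:]:
    smoothed.append(alpha * r + (1 - alpha) * smoothed[-1])   # exponential smoothing
print(f"smoothed range this frame ~ {smoothed[-1]:.1f} m (no velocity estimate)")

# "Video" style: fit the whole clip at once, so the range rate (slope) and
# acceleration (curvature) come straight out of the sequence.
t = np.arange(len(ranges)) / frame_rate_hz
a, b, c = np.polyfit(t, ranges, deg=2)    # range(t) ~ a*t^2 + b*t + c
print(f"range rate ~ {b:.1f} m/s (negative = closing), accel ~ {2 * a:.1f} m/s^2")
```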
 
You missed a major safety disengagement at 5:39 to avoid the car rolling through the stop sign.
There were so many errors I was too lazy to mark them all. That’s why I just said there were a lot of errors, some of them major, in my post.
 
FSD beta V9.1 tries to drive into a road closed sign:

XKqdNTU.png


And by the way, this is the exact same spot that FSD beta handled well before. They moved the road closed sign and it seems to have caused a regression in V9.1.

Here, FSD Beta V9.1 was going to drive over a curb in an empty parking lot:

e7XUXgA.png



Overall, FSD Beta V9.1 can do some things really well, but it also has some real head-scratching WTF moments.
 

Here are two times it proceeds without any caution: 15 mph through the alley into two blind corners. Anything could be walking or driving across the exit, or someone could open a door, but it does not consider the risk.

FSD1.png



At W. Kinzie and Ashland it makes the right turn without any caution toward the bike/pedestrian lane or the road lane; either of those users could be approaching from the left at speed.
Even if FSD can see at that angle, which I question, if you're going to turn while sticking your nose out you have to do it slowly, at the very least to give other users time to react, slow down, etc. Just appearing out of a blind lane is going to cause you to hit them.

FSD only gets away with these two situations because nobody was there; it would have had no chance to avoid a collision. (Some rough stopping-distance numbers below.)

FSD2.png
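
To put rough numbers on why creeping speed matters at a blind corner, here is the stopping-distance arithmetic with assumed reaction time and braking figures. My own illustration, not measurements from the clip:

```python
# Distance covered from "something appears" to a full stop, at a few creep speeds.
# Reaction time and deceleration are assumptions for illustration.

MPH_TO_MPS = 0.44704

def distance_to_stop(speed_mph: float, reaction_s: float = 1.0,
                     decel_mps2: float = 5.0) -> float:
    """Metres travelled during the reaction delay plus braking to a stop."""
    v = speed_mph * MPH_TO_MPS
    return v * reaction_s + v ** 2 / (2 * decel_mps2)

for mph in (15, 8, 4):
    print(f"{mph:>2} mph -> ~{distance_to_stop(mph):.1f} m to stop")
# 15 mph -> ~11.2 m, 8 mph -> ~4.9 m, 4 mph -> ~2.1 m (with these assumptions)
```

At 15 mph the car is a couple of car lengths past the mouth of the alley before it can stop; at a walking-pace creep it can stop within a couple of metres, which is the whole point of edging out slowly.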
 
FSD beta V9.1 tries to drive into a road closed sign:
The road appears to be open today; the car probably could have gotten through, but we didn't get a good enough look, and the umbrella might be high enough to pass under. Anyway, going around on the road is the better move. For some reason FSD splits the difference between two valid options and heads right down the middle towards the sign, the worst choice of all.
 
It is good that that guy has cat-like reflexes. “Just cruising through the alley at 17 mph.” I know his camera makes it look tighter than it is… but still, those poles are just a fraction of a second away. That's a huge amount of trust in the machine, and in one's own reaction time.
View attachment 692015

Frenchie even says that the car was going too fast for his liking.


Yeah, driving through that alley and taking those blind turns like that was very risky too. That's a good example of why FSD Beta is not ready for wide deployment yet. Imagine if 1M Tesla owners were using FSD Beta in those conditions and some of them got complacent and stopped paying attention; there could be tragic accidents.

Frankly, I am not sure why FSD Beta even decided to drive through the narrow alley instead of just rerouting. That seemed very odd to me. I don't think any human driver would do that. Or if they did, they would drive much slower.
 
The maroon SUV has the right of way even though it is doing a U-turn: it is in the intersection before FSD gets there, yet FSD does not yield.

Stop.png


Yield.png



Sure, just go straight over the curbs, sideswipe the pole, and go through the bikes. Is it even looking? The driver disengaged; FSD didn't.

Bikes.png
 
I don't know if a human would call that "open" when the sign was just moved... I'm only comparing to Frenchie's older videos.

As for driving policy: watching the "path line" (or whatever it is), it kept jumping from the open lane to turning left to going straight, etc. It never had a committed "path". Definitely needs to be worked out!
If the road is not closed, it is open; that's my human opinion. If I were parking locally I'd go down it, or at least give it a closer look. If I were driving onwards I wouldn't bother with the narrow road. Either way, it does look open.
 

The road closed sign is in front of the divider area and is no longer blocking the lane, so the lanes were open. But it is possible the vision system was not able to determine that; from the car's perspective, the lane could have looked like it was still closed.
 
FSD Beta keeps wanting to pass cars that are stopped, even when it shouldn't. It almost feels like code from NOA where the car's logic is to always pass slower traffic in front to get you to your destination faster.

Yep. This and unprotected turns are the most dangerous errors the Beta makes. Fortunately, this should be easily solved for a wider release by requiring user confirmation or even takeover to perform these maneuvers. It wouldn’t be level 4, but it would be a reasonable facsimile of level 3 driving. Call it level 2.5.
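
Conceptually the gate could be as simple as the sketch below: flag the handful of maneuver types the Beta gets wrong most often and refuse to execute them until the driver confirms. Hypothetical names and pseudo-logic only, nothing from Tesla's actual software.

```python
# Hypothetical confirmation gate for the riskiest maneuver types -- illustration only.
RISKY_MANEUVERS = {"unprotected_left", "unprotected_right", "pass_stopped_vehicle"}

def plan_action(maneuver: str, driver_confirmed: bool) -> str:
    """Hold risky maneuvers until the driver explicitly confirms them."""
    if maneuver in RISKY_MANEUVERS and not driver_confirmed:
        return "hold position and request driver confirmation"
    return f"execute {maneuver}"

print(plan_action("pass_stopped_vehicle", driver_confirmed=False))
print(plan_action("pass_stopped_vehicle", driver_confirmed=True))
```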
 
We'd actually just call it Level 2, defined as "You must constantly supervise these support features; you must steer, brake, or accelerate as needed to maintain safety."


j3016-levels-of-automation-image.png
 
FSD can't become even more expensive if there is any hope of addressing the real intent of FSD, which is to reduce traffic accidents and fatalities.

I've seen many people bring up the doomsday statistics in support of FSD: 1.3 million traffic-related deaths globally, so we need to take driving out of the hands of humans to solve this!

The reality is that the vast majority of those fatalities happen in countries that are far less wealthy and less developed; the stats are publicly available.


The average vehicle on the road in these places might already be worth less than the $10k FSD price tag or the cumulative cost of the subscription. No, we don't alleviate global traffic fatalities by putting this further out of reach of the people where most of the accidents are happening. Either this remains ultra-niche technology available only to the wealthy, and thus has negligible impact on global traffic fatalities, or autonomous vehicle tech needs to become much, much cheaper.
 
I think (and Elon tweeted as much some time ago) that eventually it won't be an option on any new Tesla but will instead be rolled into the purchase price. It will still be expensive, but it won't seem like it as much if it's bundled.