Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

FSD Beta Videos (and questions for FSD Beta drivers)

Well, I just did this test for you: I drove over 80 on Friday and got AP jail. Just checked my car with 10.5, and I have 0 strikes right now. So phew!

Thanks. Fairly definitive, though of course they could have changed the criteria in 10.5, so testing with 10.5 would be the definitive test. Seems pretty likely that it behaves as you describe though.
 
Okay drive for Dirty Tesla, except his car potentially would have gotten him killed at the end by apparently being happy to try to make an unprotected left in front of an approaching semi. Not good.
It absolutely could have made the turn if it had gone immediately after the car and at a reasonable acceleration.

However it hesitated, then went too late and too slowly. How can it not identify the timing? It's a clear sunny day and the truck is close, not speeding, and perfectly displayed.

It should be the easiest turn possible to calculate.
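The "easiest turn to calculate" claim can be made concrete. A toy gap-acceptance check for an unprotected left just compares the oncoming vehicle's time-to-arrival against the time needed to clear the intersection plus a margin. All names and numbers below are illustrative assumptions, not anything from Tesla's actual planner:

```python
# Toy gap-acceptance check for an unprotected left turn.
# All parameters are illustrative assumptions, not Tesla's values.

def safe_to_turn(oncoming_distance_m, oncoming_speed_mps,
                 time_to_clear_s=4.0, safety_margin_s=2.0):
    """True if the oncoming vehicle arrives later than the time we
    need to clear the intersection, plus a safety margin."""
    if oncoming_speed_mps <= 0:
        return True  # oncoming vehicle is stopped
    time_to_arrival = oncoming_distance_m / oncoming_speed_mps
    return time_to_arrival > time_to_clear_s + safety_margin_s

# A semi 150 m away at 20 m/s (~45 mph) arrives in 7.5 s: enough gap.
print(safe_to_turn(150, 20))  # True
# At 80 m it arrives in 4 s: too tight, wait.
print(safe_to_turn(80, 20))   # False
```

With the truck's distance and speed known from vision, this is one division and one comparison, which is why the hesitate-then-go-late behavior is so puzzling.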
 
Okay drive for Dirty Tesla, except his car potentially would have gotten him killed at the end by apparently being happy to try to make an unprotected left in front of an approaching semi. Not good.
Yesterday, when I engaged 10.5 for the FIRST time, it pulled up to a stop sign and was about to make a left. A similar truck was coming from my right (no stop sign for it) and making a left in front of me. So of course it aggressively tried pulling right out into its path. Then of course, on all the turns with NO traffic coming, it is VERY cautious and slow.
 
It absolutely could have made the turn if it had gone immediately after the car and at a reasonable acceleration.

However it hesitated, then went too late and too slowly. How can it not identify the timing? It's a clear sunny day and the truck is close, not speeding, and perfectly displayed.

It should be the easiest turn possible to calculate.

It also can't figure out how to slow down properly for stop signs it can see perfectly in the visualizations, over two hundred feet away (it consistently slows down way too early, then creeps up to the stop line, requiring the use of the accelerator at nearly every single well-marked stop sign), so its inability to determine the correct timing for a turn does not surprise me.

I assume the latency on the visualizations is a GUI-only sort of thing. There's still 0.5-1 second of lag on the visualizations (and I swear it changes - in the video above it doesn't seem as long as I have seen at times), so I sure hope the vehicle is not making decisions based on the positions of vehicles in the visualizations! Weird that there is so much latency in rendering the visualization, to be honest. I wonder if they're applying filtering to the user-facing information, which would necessitate some extra delay. But why would they need to filter it?
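One plausible reason filtering would add lag: if the displayed object positions are smoothed (say, with an exponential moving average to stop the icons from jittering), the filtered estimate necessarily trails a moving target. A minimal sketch, with an arbitrary smoothing factor chosen for demonstration:

```python
# Why smoothing displayed object positions adds visible lag: an
# exponential moving average trails a moving target.  alpha here is
# an arbitrary assumption, not a known Tesla parameter.

def ema_track(positions, alpha=0.2):
    """Smooth a sequence of positions; return the filtered trace."""
    smoothed = [positions[0]]
    for p in positions[1:]:
        smoothed.append(alpha * p + (1 - alpha) * smoothed[-1])
    return smoothed

# A car advancing 1 m per frame: the filtered estimate lags behind.
true_path = [float(i) for i in range(10)]
filtered = ema_track(true_path)
lag = true_path[-1] - filtered[-1]
print(round(lag, 2))  # → 3.46, i.e. the display trails by several meters
```

The heavier the smoothing (smaller alpha), the calmer the display and the larger the apparent lag; the planner itself would presumably consume the raw, unfiltered estimates.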

Perhaps the cost function of the path planner detected that there was no passenger, so there was little cost to the vehicle occupants of being broadsided by a truck full of sheetrock?

Still think we're 12-24 months (at least) from a decent version of FSD. Just a very complicated & difficult problem, and it's not clear that it's even technically possible (for anyone!) yet in the general case - these issues are the (apparently) simple ones and they're still refining the detection and response to those simplest cases.
 
  • Like
Reactions: Phlier and K5TRX
It also can't figure out how to slow down properly for stop signs it can see perfectly in the visualizations, over two hundred feet away (it consistently slows down way too early, then creeps up to the stop line, requiring the use of the accelerator at nearly every single well-marked stop sign), so its inability to determine the correct timing for a turn does not surprise me.
That stop sign problem seems to me to be caused by some kind of disjointed procedural code. It is as if the planner only works to bring the car to ~15 mph a few dozen feet from the stop line; then a different part of the code takes over and slowly moves the vehicle to the stop line.
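The physics backs up the complaint: a single constant-deceleration profile from cruise speed to the stop line would be gentle, so the slow-early-then-creep behavior isn't physically necessary. A quick check with v² = 2ad, using illustrative speeds and distances:

```python
# Constant deceleration needed to stop exactly at the line, from
# v^2 = 2*a*d.  Speed and distance values are illustrative.

MPH_TO_MPS = 0.44704
FT_TO_M = 0.3048

def required_decel(speed_mph, distance_ft):
    """Deceleration (m/s^2) to stop from speed_mph over distance_ft."""
    v = speed_mph * MPH_TO_MPS
    d = distance_ft * FT_TO_M
    return v * v / (2 * d)

# Stopping from 35 mph over 200 ft needs only ~2 m/s^2 -- well under
# the ~3 m/s^2 usually considered comfortable braking -- so one smooth
# profile would suffice instead of the two-phase behavior described.
a = required_decel(35, 200)
print(round(a, 2))  # → 2.01
```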

Perhaps the cost function of the path planner detected that there was no passenger, so there was little cost to the vehicle occupants of being broadsided by a truck full of sheetrock?
I guess you are joking here - obviously any kind of crash would have a very high cost associated with it.

I can think of a few different reasons, though.
- Execution being slow, i.e. some kind of lag between when the planner decides to start moving and when a different part of the code actually makes the move.
- Humans usually move much earlier than needed and leave a lot of room between us and the oncoming car. I've seen this a few times with FSD - it takes left turns later than I normally would, but it is actually safe. Here is an experiment you can do: after you turn left, check the rear-view mirror to see when the oncoming vehicle actually passes. Usually it is a few seconds later. In any case, the car should not take "tight" left turns; people are not used to them, and they will cause panic braking by the oncoming car too.

In the above example, it does look very close, so it was not right.
 
  • Like
Reactions: loquitur
It also can't figure out how to slow down properly for stop signs it can see perfectly in the visualizations, over two hundred feet away (it consistently slows down way too early, then creeps up to the stop line, requiring the use of the accelerator at nearly every single well-marked stop sign), so its inability to determine the correct timing for a turn does not surprise me.

I assume the latency on the visualizations is a GUI-only sort of thing. There's still 0.5-1 second of lag on the visualizations (and I swear it changes - in the video above it doesn't seem as long as I have seen at times), so I sure hope the vehicle is not making decisions based on the positions of vehicles in the visualizations! Weird that there is so much latency in rendering the visualization, to be honest. I wonder if they're applying filtering to the user-facing information, which would necessitate some extra delay. But why would they need to filter it?

Perhaps the cost function of the path planner detected that there was no passenger, so there was little cost to the vehicle occupants of being broadsided by a truck full of sheetrock?

Still think we're 12-24 months (at least) from a decent version of FSD. Just a very complicated & difficult problem, and it's not clear that it's even technically possible (for anyone!) yet in the general case - these issues are the (apparently) simple ones and they're still refining the detection and response to those simplest cases.

The frame rate of the visualizations is quite low as well, which could contribute to the perception that things are lagging. Wonder if some day we'll have 60FPS.
 
  • Like
Reactions: K5TRX
The frame rate of the visualizations is quite low as well, which could contribute to the perception that things are lagging. Wonder if some day we'll have 60FPS.
The cameras (AR0132) max out at 45 fps at full res/FOV, Tesla reads them in at 36 fps, and the NNs run at even slower rates. I don't think it'll ever hit 60 fps with the current cameras and current HW.
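A quick back-of-envelope tying this to the lag estimate quoted earlier: at the 36 fps capture rate mentioned above, each frame is roughly 28 ms, so a half-second to one-second display lag corresponds to many whole frames of pipeline and rendering delay, not just one or two:

```python
# Frame-budget arithmetic: how many whole capture frames the observed
# visualization lag (the 0.5-1 s estimate quoted earlier) represents.
# The 36 fps figure is from the post above.

CAPTURE_FPS = 36
frame_period_ms = 1000 / CAPTURE_FPS
print(round(frame_period_ms, 1))  # → 27.8 ms per captured frame

for lag_s in (0.5, 1.0):
    frames = lag_s * CAPTURE_FPS
    print(lag_s, "s ->", round(frames), "frames of delay")
# 0.5 s -> 18 frames, 1.0 s -> 36 frames
```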
 
  • Informative
  • Like
Reactions: nvx1977 and K5TRX
Here's a short clip of FSD Beta 10.5 in my Model 3 from this evening. This is fairly typical of my experience with both 10.4 and 10.5. High level of alertness required! Particularly concerning is the bike lane incursion at 0:42 and red-light braking at 1:05, but the car makes several mistakes throughout.


The most surprising thing to me is that these common mistakes seem to be more procedural programming mistakes than ML mistakes. (In fact it often seems like the two aspects are fighting each other.) The logic of how to handle a yellow light turning red, when the intersection is clear, should not be ML-limited. I'm sure the incident in my video has a corresponding "// this should never happen" in the codebase somewhere.
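The yellow-light logic really is expressible as a short procedural rule: the classic "dilemma zone" check compares whether the car can brake comfortably before the line against whether it can clear the intersection before the red. A sketch with textbook-style assumed parameters (not Tesla's actual values):

```python
# Classic yellow-light stop/go rule (the "dilemma zone" check).
# All parameters are textbook-style assumptions, not Tesla's values.

def yellow_decision(speed_mps, dist_to_line_m,
                    comfy_decel=3.0, yellow_s=3.5, intersection_m=20.0):
    # Can we stop at the line with comfortable braking?  (v^2 / 2a)
    can_stop = dist_to_line_m >= speed_mps ** 2 / (2 * comfy_decel)
    # Can we clear the far side of the intersection before the red?
    can_clear = (dist_to_line_m + intersection_m) <= speed_mps * yellow_s
    if can_stop:
        return "stop"
    if can_clear:
        return "go"
    return "dilemma"  # neither option is clean

print(yellow_decision(15, 60))  # plenty of room to brake: "stop"
print(yellow_decision(15, 10))  # too close to stop, can clear: "go"
```

Nothing here needs a neural network; the hard ML part is estimating the inputs, which, per the visualizations, the car already has.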

Some of the problem may also be a dearth of categories: if the NN isn't trained to recognize green bike lanes, it won't have a way to distinguish them from car lanes. If it isn't trained to recognize "No Right Turn on Red" signs, or if the map data doesn't specify, it'll try to make illegal right turns on red. (It does this every time at an intersection by my house.) It also doesn't yet seem to recognize or respect "merge" arrows, for instance, or signs above intersections (or painted on pavement) specifying which lanes turn which way. Perhaps once these pieces are trained and integrated, the car will become more reliable in these situations.
 

Okay drive for Dirty Tesla, except his car potentially would have gotten him killed at the end by apparently being happy to try to make an unprotected left in front of an approaching semi. Not good.
As a 10.5 driver (San Francisco), I do appreciate this video, both as a practical offering and as a morality play. I like Dirty Tesla's phrase "just go, you're [already] in the road". That's what my driving partner and I believe and have to overcome, too. But there's too much dithering of both the car ego's (and the human driver's) plans mid-maneuver. So, hysteresis obtains.

10.5 probably even calculates this as the right approach with the oncoming semi, but humans are generally too risk-averse to even attempt the turn (except this was using the "assertive" setting). I think it's a close call, in the manner of most disengagements where the human chickens out before the automatons do (so that, except in simulation, we'll never really know).

Perhaps there should be something to match human psychology here, i.e. that thing is 40-odd tons, so don't even go there and try to jump the gun, even though a plain truck or car might present a problem just the same. It's a "threshold" calculation, even like at a "gore point", so what to do? "Chill/average" may never have attempted such bravado, praise be. Like with early public FSD NOA, I bet Tesla sees a higher disengagement rate with beta FSD NOA from humans than was predicted in simulation. Human/machine symbiosis at work. Sorry for being so philosophical -- I'd like less of a white-knuckle ride, too.
 
Still think we're 12-24 months (at least) from a decent version of FSD. Just a very complicated & difficult problem, and it's not clear that it's even technically possible (for anyone!) yet in the general case - these issues are the (apparently) simple ones and they're still refining the detection and response to those simplest cases.

Yeah, I think we are still at least a year away from a decent usable version too. It seems to have all the tools it needs for Level 5 FSD; now it's just a matter of training its decision making to drive better. It hesitates when it shouldn't, it makes questionable decisions, it chooses lanes poorly, BUT it seems to see almost everything, so the sensors seem adequate. This problem seems possible to solve with the current cars and hardware; it just hasn't been solved yet.

Once Dojo is up and running I have a feeling the training part of FSD will improve very quickly, and we'll start seeing better quality driving from it. Still, progress is clearly being made so Tesla seems to be on the right path. I think it's just a matter of time now, really.
 
  • Like
Reactions: Phlier and K5TRX
There is a section of road that FSD Beta fails dramatically and consistently. Sometimes it even just stops in the middle of the road at a 45 degree angle with no messages and refuses to move like it gives up.

I'm not a YouTuber and don't want a professional video setup, nor am I calling out Tesla (I've submitted snapshots and emailed since 10.2). 10.5 has been a measurable improvement but not for this section of road.

I'm up in NW Austin if anyone wants to try the road, but I have to caution you that you'll need to be prepared: it gets confused, you will need to slow down below the speed limit, and for some reason it doesn't stop for cross traffic either (I think it sees the cross traffic as passing in a different lane).

I took it several times yesterday, and it is so bad that I was cracking up. Elon hasn't pin-dropped this area yet. 30°25'42.6"N 97°50'29.7"W · Canyon Creek, Austin, TX

Driving westbound on Boulder out of the neighborhood (or you could maybe start from the church), with Nav set to Freebirds on 620. Be safe. The car ends up turning too early, gets stuck in the left turn crossing, and can't get out. It will not wait for cross traffic; it enters the oncoming lane and either stops, charges into the closed parking lot at 45 mph thinking it is a road, or U-turns into a locked/gated apartment complex.
 
The most surprising thing to me is that these common mistakes seem to be more procedural programming mistakes than ML mistakes. (In fact it often seems like the two aspects are fighting each other.) The logic of how to handle a yellow light turning red, when the intersection is clear, should not be ML-limited. I'm sure the incident in my video has a corresponding "// this should never happen" in the codebase somewhere.
Check out AI Day. You will get a better understanding of which parts are NN and how the cost-optimization path planner works.

Some of the problem may also be a dearth of categories: if the NN isn't trained to recognize green bike lanes, it won't have a way to distinguish them from car lanes. If it isn't trained to recognize "No Right Turn on Red" signs, or if the map data doesn't specify, it'll try to make illegal right turns on red. (It does this every time at an intersection by my house.) It also doesn't yet seem to recognize or respect "merge" arrows, for instance, or signs above intersections (or painted on pavement) specifying which lanes turn which way. Perhaps once these pieces are trained and integrated, the car will become more reliable in these situations.
Currently, FSD Beta is vision + map based. But the map is needed to figure out turn lanes, turn restrictions, etc. FSD doesn't "read" those - just as it doesn't read road names. If there are map issues, or FSD can't match what it sees correctly with the 2-D map information, we get weird issues.
 
  • Like
Reactions: Ben W and Phlier
It was not funny. Childish I would say.
With all due respect to your opinion, I'd like to confirm that you're aware that it's a parody of the CNN hit piece (which I should have mentioned). Here are three videos:
1) The original CNN video
2) The video shot by the vehicle owner from the back seat during the same (CNN) drive
3) The parody of the CNN video

I would argue that the original CNN video was so distorted it was almost a parody on its own.

 
  • Love
Reactions: Phlier
With all due respect to your opinion, I'd like to confirm that you're aware that it's a parody of the CNN hit piece, which I should have mentioned. Here are both videos (in between is the video shot by the vehicle owner from the back seat during the same drive). I would argue that the original CNN video was so distorted it was almost a parody on its own.


The funny part was him imitating the original seat position and not being able to get the visor to clear his head.
 
  • Funny
Reactions: EVNow and Phlier