Welcome to Tesla Motors Club

FSD Beta Videos (and questions for FSD Beta drivers)

It thinks it can still make it around that truck! I feel like the path planner may not take into account the velocity of other vehicles? Seems like a bit of a limitation.
Of course the planner takes the velocity of other vehicles into account; otherwise, how would it ever judge when to turn? Predicting what other vehicles are going to do (at a minimum, each vehicle's path and position over time, given its current speed and acceleration) is needed for driving. Advanced prediction would also include what other vehicles would do in response to your actions.
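To make the "minimum" prediction concrete, here's a hedged sketch of extrapolating another vehicle's position under a constant-acceleration model and checking whether a turn clears it. All function names and numbers are illustrative, not anything from Tesla's actual planner:

```python
def predict_position(pos: float, speed: float, accel: float, t: float) -> float:
    """Position along the vehicle's path after t seconds,
    assuming constant acceleration: x = x0 + v*t + 0.5*a*t^2."""
    return pos + speed * t + 0.5 * accel * t * t

def gap_when_we_cross(truck_pos: float, truck_speed: float,
                      truck_accel: float, crossing_time: float,
                      intersection_pos: float) -> float:
    """Distance the truck still has to the intersection when our turn
    completes. Negative means the truck arrives first -- don't go."""
    return intersection_pos - predict_position(
        truck_pos, truck_speed, truck_accel, crossing_time)

# Truck 40 m out at a steady 15 m/s; our turn takes 3 s.
# It covers 45 m in that time, so the gap is -5 m: reject the turn.
print(gap_when_we_cross(0.0, 15.0, 0.0, 3.0, 40.0))  # -5.0
```

The point is just that even this crude physics gives a go/no-go answer once you know speed and acceleration, which is why a planner that ignored velocity could never judge a gap at all.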
 
It thinks it can still make it around that truck! I feel like the path planner may not take into account the velocity of other vehicles? Seems like a bit of a limitation.
It's hard to tell if it's a dip in the road or if the truck is braking just before it enters the intersection, but the front of the truck dips a couple of times. It could be that the planner calculated when the truck would pass before it braked, and had started to roll in anticipation. I had that happen to me on an unprotected left, but in my case the car stopped completely before it entered the intersection. Beta aborted the turn and came to a stop.

If you watch the video at 1/4 speed, you can see that the steering wheel has turned back to the right before Beta is disengaged. It's farther to the right than when it was waiting to turn, so it might have corrected itself. Just before it disengages, you can see that the driver jerks the wheel even further to the right. I certainly wouldn't have waited to find out.
 
It could be that the planner calculated when the truck would pass before it braked, and had started to roll in anticipation.
Except the planner plotted a course in front of the truck in the visualization. You’d expect that it would plot a course behind it if that were the plan. I wonder if they include “broadside by Dodge Ram” in the cost function? May need to add some additional weight on that branch of the decision tree; even if it results in a sublime turn, the decision may be ill-advised.
 
Which in this case manifests as not taking into account vehicle velocity (effectively), even if that’s not what is actually happening. It doesn’t really matter to the end user what the root cause is of course. Root cause only matters if you’re trying to fix it!
But we are playing "figure out the root cause" here all the time ;)

It seems to me the truck may have slowed down a bit? Anyway, good thing we are not in charge of fixing such bugs :oops:
 
Except the planner plotted a course in front of the truck in the visualization. You’d expect that it would plot a course behind it if that were the plan. I wonder if they include “broadside by Dodge Ram” in the cost function? May need to add some additional weight on that branch of the decision tree; even if it results in a sublime turn, the decision may be ill-advised.
The car doesn't take every path that the planner displays on the screen. I mean that thing bounces all over the place as it considers various solutions. It could have noticed the truck was slowing and considered whether it could make it ahead of it.
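That "bounces all over the place" behavior is consistent with a planner that scores several candidate trajectories every cycle and commits only to the cheapest, with a huge penalty on any branch predicted to intersect the truck. The sketch below is purely illustrative; the candidate names, weights, and penalty value are invented, not Tesla's:

```python
COLLISION_PENALTY = 1e6  # heavy weight on the "broadside by Dodge Ram" branch

def path_cost(time_to_goal: float, min_gap_to_truck_m: float) -> float:
    """Lower is better: prefer fast paths, but any path whose predicted
    clearance to the truck goes negative eats the collision penalty."""
    cost = time_to_goal
    if min_gap_to_truck_m < 0.0:
        cost += COLLISION_PENALTY
    return cost

candidates = {
    "turn ahead of truck":   path_cost(3.0, -5.0),   # fast but unsafe
    "wait, turn behind it":  path_cost(8.0, 12.0),   # slower but clear
}
best = min(candidates, key=candidates.get)
print(best)  # wait, turn behind it
```

Under this kind of scoring, a path drawn on the screen one frame can lose to a different path the next frame as soon as the predicted gap changes sign, which would explain a visualization that shows a course the car never actually takes.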
 
IMO, (which is probably completely wrong), the bug lies in the classification of cars between being "active traffic" and "parked car."

This situation looks similar to one that has been happening to me quite a bit lately... the car will properly follow a car in front of it. The preceding car stops to yield to traffic before making a right turn. My car waits about ten seconds, then decides to re-categorize the preceding car from "active traffic" to "parked car," at which point my car will try to go around it.

It kinda looks like, given the path prediction vector, FSD decided the truck is now classified as "parked car" and decides to "go around" it, even though it's still moving.
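If the reclassification really is driven by a fixed stationary timeout, the failure mode above falls out naturally. This is a hypothetical sketch built only from the ten-second observation in this thread; the threshold, class names, and structure are assumptions, and Tesla's real logic is not public:

```python
PARKED_TIMEOUT_S = 10.0  # assumed threshold, from the ~10 s observation above

class TrackedVehicle:
    """Toy tracker that demotes a stopped lead car to 'parked car'
    after a fixed timeout, using only elapsed stationary time."""

    def __init__(self) -> None:
        self.stationary_for = 0.0
        self.label = "active traffic"

    def update(self, dt: float, speed: float) -> str:
        if speed > 0.1:                      # moving again: reset the clock
            self.stationary_for = 0.0
            self.label = "active traffic"
        else:
            self.stationary_for += dt
            if self.stationary_for >= PARKED_TIMEOUT_S:
                self.label = "parked car"    # triggers "go around" behavior
        return self.label

truck = TrackedVehicle()
for _ in range(11):                          # truck pauses 11 s at the corner
    label = truck.update(1.0, 0.0)
print(label)  # parked car
```

The flaw in such a rule is that it looks only at how long the vehicle has been stopped, not *why*: a truck yielding before a turn and a genuinely parked car are indistinguishable to it, which matches the "go around a car that's about to move" behavior described above.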

What do you guys think? Is this bug potentially serious enough to be considered a "show stopper?" The more instances of it I've had, and the more I see it posted here and on YouTube, the more I'm leaning toward yeah.... this isn't good at all, and Tesla might want to consider pulling it until it's fixed.

I'm not trying to make anyone mad here, and I understand this has the potential of getting more than a few pieces of underwear twisted. Let's try and voice our opinions in a reasonable manner, yeah? ;)

And while I do understand the opinion of "well, as long as you're doing your job monitoring, it should be OK," there are too many people out there who aren't monitoring well enough, and this particular bug can put you in a world of hurt so fast that even the best monitor in the world might not be able to recover in time.

I dunno... maybe I'm way off base here, but this looks potentially very serious.

Personally, I'm done using Beta FSD until this one gets resolved.
 
IMO, (which is probably completely wrong), the bug lies in the classification of cars between being "active traffic" and "parked car." [...] Personally, I'm done using Beta FSD until this one gets resolved.
I've written my fair share of software. Most of what is being discussed here goes beyond a 'bug' in the software. A bug is miscalculating the truck's speed due to some incorrect code; these problems are flaws and oversights, and could even be intentional behaviors (like the California Stop). Having the left turn signal flashing while the car makes a right turn, though, is something I would call a bug.
 
I was kinda hoping it would rate FSD on safety, coz that biotch drives crazy. Talk about hard braking and aggressive turning! I'm looking forward to watching my wife's reaction when she comes along; she'll scream. On a positive note, it did an excellent job giving way to oncoming residential traffic while steering around leaf piles. Stop signs are goofy. In my neighborhood, it stopped about a house away from the corner and did the creep forward, while the driver behind me was like WTF are you doing??? LOL.

My wife rode with me today for the first time since I got the FSD 10.2 beta last week. She said the same thing about how Tesla should be doing the safety rating on the FSD system. My wife kept saying "WTF is it doing?" Needless to say, we submitted quite a few autopilot snapshots today... :)
 
It's a neural network, so there's no code in predicting the speed or trajectory.
Hmmm. Maybe I have it wrong (as is often the case), but I thought that one of the neural network's jobs is to decide what code to run based on how it classifies the object in question. Like: the car sees an object, the NN classifies it as "parked vehicle," and the car runs the "go around parked vehicle" procedures/code.

In one of the videos I watched where the labels had been added to the visual camera feed, it was interesting to see the labels applied to objects, as well as such things as the object's speed, distance, direction of travel, etc. It only added those additional labels if the vehicle wasn't classified as "parked vehicle."
 
I thought that one of the neural network's jobs is to decide what code to run based on how it classifies the object in question. Like: the car sees an object, the NN classifies it as "parked vehicle," and the car runs the "go around parked vehicle" procedures/code.

‘Not sure what you’re referring to here, but Tesla uses human-created procedural code along with NN predictions.

if (car parked) and (enough room to fit) then drive past

where the statements in parentheses come from the NN predictions
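Fleshing out that pseudocode a bit: this is a hedged sketch of how a human-written rule can sit on top of predicates produced by perception networks. The predicate functions, field names, and thresholds here are stand-ins, not real Tesla APIs:

```python
def is_parked(vehicle: dict) -> bool:
    # In the real system this would come from an NN classification output.
    return vehicle["label"] == "parked car"

def enough_room_to_pass(vehicle: dict, lane_width_m: float = 3.5) -> bool:
    # Stand-in for a geometric check fed by NN size/occupancy estimates.
    return lane_width_m - vehicle["width_m"] >= 2.0

def maneuver(vehicle: dict) -> str:
    """Human-written procedural rule layered on top of the NN outputs."""
    if is_parked(vehicle) and enough_room_to_pass(vehicle):
        return "drive past"
    return "follow / wait"

print(maneuver({"label": "parked car", "width_m": 1.5}))      # drive past
print(maneuver({"label": "active traffic", "width_m": 1.5}))  # follow / wait
```

Note how the whole dispute upthread reduces to the first predicate: if the NN mislabels a yielding truck as "parked car," the perfectly correct procedural rule still produces the dangerous "drive past" behavior.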
 
WOW...enhanced autopark using the cameras is very impressive. It's shown as the first segment in this video and is a significant change! I'll look forward to more tests and to trying it myself. I've never bothered to use autopark other than to experiment with it...this may actually be useful now.

Rob from the Tesla Podcast/YouTube channel "Tesla Daily" showed a video of using it at night in an empty parking lot. It took it a bit to complete the maneuver, but the resulting parking job was perfect.

Glad to hear others are having similar success with it.
 