
FSD Beta Videos (and questions for FSD Beta drivers)

The driver was not prepared (complacent, with wrong hand placement) and made a 270-degree wheel turn to save it, with awkward positioning.

The car did terribly, but as far as I can tell FSD did not disengage. So the car made the turn. This driver is really trusting the process! :eek:

Handled like a champ, lol. Makes driving so much easier and less stressful. :rolleyes:
 
Anyone know/have a video of what "FSD" does when it encounters a yield sign, in the various cases of needing to yield vs not?
From watching vids of the car approaching roundabouts (most of which have yield signs for cars entering the roundabout), the car will mostly stop and treat it as a stop sign. I've seen it occasionally just enter properly when there's no impeding traffic, but mostly I've seen the car stop. And I honestly don't know how many of the times I've seen it handled properly were the car doing it by itself versus the driver hitting the accelerator. Most people taking vids will note when they hit the go pedal, but not always.
 
I'd like to vote for implementing Arizona rules: biggest car wins
Mass or volume?

We had a rule in college when we had to cross a big street: once the mass of the students was greater than that of the oncoming vehicle, you could all just go....

Now I want to see a Green video where he shows the Tesla estimating the mass of the other vehicle and feeding that into the path planning algo ;)
 
The car did terribly, but as far as I can tell FSD did not disengage. So the car made the turn. This driver is really trusting the process! :eek:

Handled like a champ, lol. Makes driving so much easier and less stressful. :rolleyes:
I had to watch it again because I was sure it was a disengagement. It handled it like a champ who just got rocked by an uppercut, staggered a bit but recovered.
 
This driver is really trusting the process! :eek:
Is this the kind of "safe" driver we want testing FSD?

You're supposed to be attentive and take over when AP doesn't do the right thing. Cranking the wheel towards a barrier is clearly the wrong thing. Do we think the FSD NDA tells them to do something different, or is this another case where, if the car had darted into the barrier, we would have called the driver an inattentive idiot who didn't read the manual?

The reality here is that he did try to take over; he just got lucky that it steered exactly with him. That was a disengagement in everything but the literal sense, yet it "handled it like a champ."

I also just noticed that it says "stopping for traffic control signal in 50 feet"; I guess that's referring to the pedestrian sign? Does FSD seriously do this for every crosswalk with a sign? There are about 10 crosswalks on the one street leaving my neighborhood.
 
Safety is of course paramount.
So do we consider every human that drives past a crosswalk without stopping to be a menace?
At some point, these Teslas on FSD stopping at every yield and crosswalk sign, or creeping out and then stopping to get a view, will actually be dangerous, since they behave so differently from human drivers.

"When driving, don't be nice. Be predictable."
 
So do we consider every human that drives past a crosswalk without stopping to be a menace?
At some point, these Teslas on FSD stopping at every yield and crosswalk sign, or creeping out and then stopping to get a view, will actually be dangerous, since they behave so differently from human drivers.

"When driving, don't be nice. Be predictable."
This has always been an issue with L4 vehicles; the early Cruise cars here in SF were like this. They were overly cautious and unpredictable (at least in the sense that they didn't act like human drivers do), and this made taxi drivers' blood boil when they ended up on the same street. But you can't actually say it's "unsafe" as it does minimize the chance of a crash from making a mistake, even though it may irritate every driver on the road.

Of course the FSD Beta cars are not L4, so they can probably err on the side of doing less of that, although currently Tesla's production system still does those confirmations AFAIK (I believe it only skips them if there is a lead vehicle).
 
But you can't actually say it's "unsafe" as it does minimize the chance of a crash from making a mistake, even though it may irritate every driver on the road.
I'd think only long term statistics would prove that. While the vehicles themselves may hit something less often, if they are hit or cause other accidents, then as a society we may have moved backwards. The system we are trying to solve here is overall human harm, not just people in autonomous cars.
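
As a toy illustration of that point (every number here is invented), fewer at-fault crashes per mile doesn't guarantee less total harm if the cautious behavior induces more crashes by other drivers:

```python
# Hypothetical crash rates per million miles -- invented for illustration.
human_driver = {"at_fault": 2.0, "induced": 0.5}
cautious_av = {"at_fault": 0.5, "induced": 2.5}

def total_harm(rates):
    # Society counts every crash the vehicle is involved in,
    # regardless of who is legally at fault.
    return rates["at_fault"] + rates["induced"]

print(total_harm(human_driver))  # 2.5
print(total_harm(cautious_av))   # 3.0 -> net harm went up
```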

although currently Tesla's production system still does those confirmations AFAIK (I believe it only skips them if there is a lead vehicle).
Are you saying Tesla's "FSD" will drive through a crosswalk or yield if there is a lead vehicle, but stop if there is not and requires human intervention? That is suddenly way less "FSD" than even I thought as a skeptic. I guess all these FSD video drivers are constantly tapping the accelerator and not mentioning it? It also means that it will totally break the law sometimes when the lead vehicle can make it fine but you are supposed to stop.
 
I'd think only long term statistics would prove that. While the vehicles themselves may hit something less often, if they are hit or cause other accidents, then as a society we may have moved backwards. The system we are trying to solve here is overall human harm, not just people in autonomous cars.
Sure, that's true, but if the Tesla gets hit, that's counted as a no-fault accident, and the other driver would be at fault.
Are you saying Tesla's "FSD" will drive through a crosswalk or yield if there is a lead vehicle, but stop if there is not and requires human intervention? That is suddenly way less "FSD" than even I thought as a skeptic. I guess all these FSD video drivers are constantly tapping the accelerator and not mentioning it? It also means that it will totally break the law sometimes when the lead vehicle can make it fine but you are supposed to stop.
By production I mean the public release, not the beta. When I had my FSD trial at the beginning of the year, the car asked for confirmation through every light and only didn't when there was a lead vehicle. I don't watch FSD Beta videos enough (nor have I tried it myself, obviously) to determine whether the behavior is the same, and I'm also not sure if an update has changed this in the production version.
 
I'm looking forward to new FSD beta videos this weekend (and hopefully to real progress).

 
Called it! Two weeks and two days ago:
I'll believe it when he goes 6 months without something crazy, not just 1 week. As it is now, we're just as likely to get a "9.4 is 🔥 and will blow your mind" two weeks from now as we are "9.4 is a nice small step forward, progress continues".

I'll repost this while I'm here too:
I'm guessing Elon's experience with any "FSD" version goes this way:

1) He gets it before anyone else. He drives it on limited routes in the LA or Brownsville areas that engineers know he will drive. It works pretty well in those 30 minutes (he's a busy guy; let's not pretend he ever drives it more than that). He tweets that it's awesome.

2) It goes to "FSD" beta testers. They use it in actual, hard traffic and city streets, not just highway entrances. They post YouTube videos. It's not great. Elon sees this, adjusts his opinion, and resets his expectations for the next version (but usually doesn't tweet that).

3) Repeat #1, where now his drive is better than the YouTube videos of the previous version, so it must be amazing, because he forgot he already said the previous version was amazing before he learned that it was not.
 
I'll repost this while I'm here too:
I'm guessing Elon's experience with any "FSD" version goes this way:

1) He gets it before anyone else. He drives it on limited routes in the LA or Brownsville areas that engineers know he will drive. It works pretty well in those 30 minutes (he's a busy guy; let's not pretend he ever drives it more than that). He tweets that it's awesome.
This implies engineers are hand-tuning it to fit his roads specifically, but from what has been shown in presentations this is basically impossible. It is, however, likely the NNs are "overfit" to the roads local to Tesla, given their internal testing is all done there (this would happen naturally without any deliberate action from engineers). But it isn't something you can really tweak by hand (only some of the decision-making parts are still hand-coded, and earlier in the year they had already started migrating those to NNs).
 
likely the NNs are "overfit" to the roads local to Tesla
But it isn't something you can really tweak by hand
Never said they tweaked it by hand. But "overfitting", or making sure it has tons of training data for specific roads, is still something humans can do to improve performance in the specific areas they want to focus on. In fact, it's one of the things you have to actively guard against when training an NN; there are all sorts of biases you can introduce through the training sets you choose. Just engineers being in CA, experiencing failures personally, and focusing on fixing those is a kind of bias.
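
To make that concrete, here is a minimal sketch (in no way Tesla's actual pipeline; the region names and weights are invented) of how deliberately oversampling clips from favored areas biases what the network learns:

```python
import random

def sample_batch(clips, batch_size=256, region_weights=None):
    """Weighted sampling of training clips, oversampling favored regions."""
    region_weights = region_weights or {}
    weights = [region_weights.get(c["region"], 1.0) for c in clips]
    # A weight of 5.0 makes a Palo Alto clip 5x as likely to land in
    # any given batch -- exactly the dataset bias described above.
    return random.choices(clips, weights=weights, k=batch_size)

clips = [{"id": i, "region": "palo_alto" if i % 4 == 0 else "elsewhere"}
         for i in range(10_000)]
batch = sample_batch(clips, region_weights={"palo_alto": 5.0})
```

The same skew happens accidentally when the raw data itself is mostly collected near the office, which is the "naturally overfit" case mentioned above.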

What's your opinion on why Elon can experience 8.X or 9.X, call it fire and "blow your mind," and then within 60 minutes of the release we have videos of it driving straight at concrete pillars, and then a few weeks later Elon is calling it "not that good"?
 
I'm sure all of Elon's typical routes have been labeled and used in training many times over, so the models are strongly overfit to what Elon uses and likes. If Elon, your boss, asks "can you make it do 'X'?", every engineer is going to respond "yes sir, right away sir." None of us will get that luxury. Perhaps yes in the long term, with a long list of custom settings.
 
Nice to get direct screen-feed videos, but disappointing to still see large trucks jumping around in the visualization with FSD Beta 9.2, which should be using Vision-only builds:

[Screenshot: the visualization rendering a duplicated truck]


Although potentially that's what Elon Musk meant by combining the highway and city street stacks: the production (highway) Vision-only stack still focuses primarily on the forward cameras to predict velocity, while city streets was trained on all cameras for a consistent birds-eye view. Hopefully Beta 10 will fix this.
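
If that's right, the "dupe truck" above could be a plain cross-camera association failure. A toy sketch of the idea (function names, camera labels, coordinates, and the merge radius are all invented; this is not Tesla's pipeline):

```python
from math import dist

def merge_detections(per_camera_dets, merge_radius_m=2.0):
    """Naively merge per-camera detections into one ego-frame object list.

    per_camera_dets: {"main": [(x, y), ...], "pillar_right": [...], ...}
    """
    merged = []
    for cam, dets in per_camera_dets.items():
        for pos in dets:
            # Without this association step, each camera's copy of the
            # same physical truck would be kept as a separate object.
            if all(dist(pos, kept) > merge_radius_m for kept in merged):
                merged.append(pos)
    return merged

frame = {"main": [(12.1, 3.0)], "pillar_right": [(12.6, 3.4)]}
print(merge_detections(frame))  # one truck, not two
```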
 

Is it expected that one day the visualisations will be steady and glitch-free?

I have never understood the need to display visualisations that glitch if the car is working from a stable interpretation for FSD/AP that could be used to feed the driver display. A bit of lag is no problem, but why the jumping around, unless that's the best visualisation that exists?

With all of the car's AI and NN processing, how can it believe that a vehicle jumps around like that?
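
One plausible answer (speculation, not anything confirmed about Tesla's pipeline): the screen may be rendering raw per-frame detections, so every frame's noise shows through, whereas a tracker with even simple temporal smoothing would steady it. A minimal sketch, with all names and numbers invented:

```python
class SmoothedTrack:
    """Exponential smoothing as a stand-in for a real object tracker."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha  # 1.0 = raw detections, lower = steadier
        self.pos = None

    def update(self, detection):
        if self.pos is None:
            self.pos = detection
        else:
            # Blend the new detection into the running estimate.
            self.pos = tuple((1 - self.alpha) * p + self.alpha * d
                             for p, d in zip(self.pos, detection))
        return self.pos

track = SmoothedTrack()
for det in [(10.0, 2.0), (10.4, 1.6), (9.7, 2.3), (10.1, 2.0)]:
    print(track.update(det))  # drifts smoothly instead of jumping
```

The trade-off is lag: the more you smooth, the later the display reflects a real maneuver, which may be why some jitter is tolerated.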