
FSD Beta Videos (and questions for FSD Beta drivers)


Another video from the same people as one of the videos above. The incidents are nicely labeled, so it's easy to skip around the video to see what you want. Lots of work to do, obviously. I'm not sure I'd be quite so irresponsible with the Beta, though, as to let it get into situations that require reactions from other drivers. It should be completely expected at this point that the Beta will get into accidents (averted only by other alert drivers) if left to do its own thing. Nothing wrong with that expectation - Tesla tells you it will get into accidents.

The dramatic "whoa" throughout the video is a bit much. It's no surprise that the FSD Beta does the wrong thing a whole bunch, so I'm not sure why there is so much surprise and drama around it.

Hopefully the Button will come and a lot of new people can try it out - hopefully responsibly.

At the moment it looks about as I would expect in this sort of difficult driving environment. A big issue is obviously timidity: lots of honking at the Tesla because of its hesitancy, which you can also see in the drone videos above. More assertiveness would be great, so it gets up to speed before entering the traffic lane. It looks really bad to have people passing the Tesla after it has made it all the way across in those drone videos; you don't want anyone to have to slow down for you. Biasing it toward caution is probably right, of course, since the beta testers don't seem to be particularly quick to take over when it does the wrong thing. It makes you wonder, though. Maybe a more assertive driving style would keep testers more on their toes. They would have to keep that foot right over the brake at all times!
 

Well it does keep your attention. I remember my Young Drivers teacher would occasionally reach over and pull the wheel partly onto the gravel shoulder and once he turned the car off. I wasn't very happy about it but he did prove a point about preparing for the unexpected.

Mostly these guys are doing well as they are expecting it to fail thus being prepared to take over. It's a tough call between too little and too much leniency when you're talking about split seconds.

It's odd that some people have such trouble while others, like Whole Mars, have no trouble at all.
 
since the beta testers don't seem to be particularly quick to take over when it does the wrong thing.

We see this with TACC/AP/NoA as well. In fact, I've been irked at myself for letting NoA continue along just so I could figure out what it was thinking, instead of just giving up on it in some specific area/situation.

For TACC I use a larger follow setting than I really need, because it gives TACC time to start slowing down and gives me time to take over from TACC if it ever doesn't slow down in its typical fashion.

If/When I do get the FSD beta my plan is to take over quickly, and not allow curiosity to get me into a bad situation.

Things like:

immediately taking over if the angle is wrong
immediately taking over if there is indecision on which lane to take
immediately taking over if it's time to go and it's not going

There are some situations that I probably won't try it in. Situations that don't offer much buffer between the time it will do something, and the time I need to counteract.

I'm not a paid employee, nor am I particularly interested in being put into near-miss situations.
 
#FSDBeta 8.2 - 2021.4.11.1 - 4K Drone View of Unprotected Left Turns with Forward Facing Traffic

Great! Thanks for posting that.

The #1 thing I want Tesla FSD Beta to do is to TELL us when it's going to go.

If I was training a new driver and they kept going or creeping at the wrong time I would eventually say "From now on you have to TELL me when you're going". This guessing as to its intent is just the worst ;)


#2 is to go at the right time...well perhaps that's #1
#3 is to go faster
 
#FSDBeta 8.2 - 2021.4.11.1 - 4K Drone View of Unprotected Left Turns with Forward Facing Traffic

9 Attempts

1 - Fail, seemed about to go in front of traffic
2 - Fail, seemed about to go in front of traffic
3 - Pass, wide gap, slow attempt, but made it
4 - Fail, entered roadway with oncoming car and seemed about to go into traffic
5 - Fail, seemed about to go in front of traffic
6 - 5 Reset - Pass, moderate gap, speed better than #3
7 - Pass, moderate gap, speed better than #3
8 - Pass, widest gap, speed better than #3. Accelerator intervention to unwind steering wheel. Turn signal was not on during the turn
9 - Fail, seemed about to go in front of traffic
 
Thanks for doing these tests, but I've got to say I would much prefer that Tesla use dedicated test drivers for tests like these on public streets. Not everyone is going to be as careful as Chazman92, and all it takes is one slip of attention to cause a serious accident in these driving scenarios.
 
#FSDBeta 8.2 - 2021.4.11.1 - 4K Drone View of Unprotected Left Turns with Forward Facing Traffic

Great job on the videos with the drone! Super interesting.

It looks like there's been some kind of regression for left turns across oncoming traffic - I imagine they'll need to sort that out before any big widening of the beta. Or they could set up some system requiring driver confirmation for the wider beta, kind of an extension of the traffic light & stop sign control we have now.
 


But but but, @powertoold told me that in about 30 days FSD Beta would surpass humans - that it would be able to go 150k miles between disengagements that would otherwise lead to an accident. This prompted a 3-page discussion on 150k miles per disengagement.

Alas, @powertoold, like Elon Musk, just makes another prediction whenever he knows the current one will be wrong.
Now the next guarantee is end of 2021, which is about 9 months away.
Both will fail, and then he will make another one like clockwork.
 
Lol what, I never said that. Is anyone taking you seriously anymore? Tone down the strawman yo

On the other hand, it was surprising to learn that there haven't been any accidents on FSD beta yet, whatever that means.

Look at the context of your statement. It all started with @DanCar asking...
"When will it be safe enough?"

You replied with:
Probably somewhere around an accident probability of every 100-200k miles. This would be a ridiculous achievement. I don't know what Tesla's internal goal is.

DanCar responds with:
"That is three years away." Then you asked "What specific issue with the FSD beta do you think would take 3 years to improve?"
Prompting him to say "Nothing specific, just avoiding accidents except for every 150K miles. Wife had an accident running into a black curb on a black street at night. I've seen stop signs that are mostly occluded. What happens when teenagers prank tesla with adversarial attacks? Put some tiny white tape on a stop sign? Many interesting scenarios. What happens when a toddler runs into a street from a park chasing a ball? Most people would be on guard. What if there are bushes mostly occluding the view of toddler entering the roadway? My estimate / guestimate is three years. What is your estimate / guestimate for no accidents on average every 150K miles?"​

@diplomat33 also chips in response to DanCar's when will it be safe enough
"Yes, people will trust it more, that is why it needs to achieve an acceptable "driverless accident rate". In other words, you have to assume the driver will not pay attention, would the accident rate still acceptable? We don't know what magic number is. Tesla just says "safer than humans". So presumably, when Tesla's FSD achieves Tesla's internal metric, Tesla would probably remove driver supervision. We know Waymo is willing to deploy their cars in driverless mode in a geofenced area with an accident rate of 1 crash every 338k miles and 1 incident every 130k miles."

@Dutchie begins calculating
"If you start off with 1 intervention every 7 miles and every two weeks the number reduces by 1/3, than after 6 updates you are at one intervention every 156,865 miles..."

@DanCar responds saying his three years is based on
"Tesla 1/3 improvement every month in interventions
My assumptions are: start at 1 intervention every 5 miles. Every month improve by 1/3rd. 37 months before reach 150K miles intervention."
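
For what it's worth, here's a quick back-of-the-envelope sketch in Python of the compounding math DanCar lays out. The inputs (1 intervention every 5 miles to start, miles-per-intervention growing by 1/3 each month, a 150K-mile target) are just his stated assumptions, not anything from Tesla, and the helper name is made up for illustration:

# Sketch of the compounding-improvement guesstimate quoted above.
# All numbers are the posters' assumptions, not Tesla data.
def months_to_target(start_miles=5.0, monthly_gain=1/3, target_miles=150_000):
    """Return (months, miles) once miles-per-intervention reaches the target."""
    miles, months = start_miles, 0
    while miles < target_miles:
        miles *= 1 + monthly_gain  # one month of compounding improvement
        months += 1
    return months, miles

months, miles = months_to_target()
print(f"~{months} months -> 1 intervention every {miles:,.0f} miles")
# Under these assumptions this lands close to DanCar's "37 months" / three-year guesstimate.

So under those stated assumptions the arithmetic really does come out to roughly three years.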

Clearly NO ONE is talking about combined human driver + FSD Beta stats like what Tesla claims to release quarterly. They are talking about how safe FSD Beta is by itself.

You came in and proclaimed:
It's crazy I even think this, but 6-9 months lol. We've seen essentially every human maneuver from this beta, even unintuitive or difficult to program ones like:

This is when @DanCar responds to your post of 6-9 months by saying:

"I'm willing to take a bet on that. :) Tesla will not release level 3 or 4 wide release for cities in that time frame or that published city intervention rate is more than 150K miles by that time frame."

That should be the mic drop. But I can add other posts, like @BobbyKings responding to Dan's question to you by saying
"If you would assume an average 16 mile average intervention rate now, and a doubling of that every quarter then within 4 years you would achieve that. Lots of assumptions though, but assuming exponential improvement seams ok."

Everyone and their grandma knew what was being discussed.

And this continues for several more pages. When I called you out directly two weeks later, you didn't deny that you were talking about safety disengagements that would lead to an accident every 150k miles. You actually conceded it by saying the disengagement rate won't be linear.

To add more context to this as it relates to reliability compared to humans, you said:

it's game over. Once the FSD beta is widely deployed, it's game over for sure. Tesla can't and won't widely release this FSD beta until it's at least as good as an average human. There's no reason to, as the liability would be greater than any FSD revenue benefit.

Tldr: if Tesla is *ever* able to widely release the FSD beta, it's game over for any competition.

But of course, months later when I revisited your statement, you completely denied it and moved the goalposts yet again:

With these in mind, I'd say Tesla will achieve level 5 with average human level reliability by the end of this year, for sure.

When called out again, you backpedaled and gave a BS response that can't be proven, contradicting your original statement. This is coming from a guy who dismisses every other SDC company as fake, marketing, lies, and PR unless its system is in customers' hands. But of course Tesla is the exception. They are literally God incarnate, the arbiter of truth. So if they say this one car is L5, ta-da.
Also, you're making assumptions about my prediction, since I don't give it full context. Level 5 doesn't mean it needs to be available to everyone. It just means I predict Tesla will have at least one car (maybe it's their test car) that:

Strawman? I think not.
I just pay attention to detail and call out the BS. I've been doing it for over 5 years; this isn't new to me.
You weren't the first to claim that Tesla will have Level 5 by this date and it's game over for the competition.
You won't be the last.
 
I just pay attention to detail and call out the BS.

Lol, no you don't, man. You constantly spew nonfactual information and misinterpret things.

Also, you keep using one video from one area (like downtown Chicago or Oakland) to make your point. Obviously no one expects FSD beta to be great in every location. It's about averages, not "let's find the worst area for FSD beta and use that to take a dump on it or someone".

Also, can't you just wait until 8.3 comes out before you bash me? All the stuff you quoted above is my opinion. I'm entitled to it, and it hasn't been 6-9 months since October 30 yet. Crazy, man.
 
Before anyone calls me a hater: I paid and upgraded to FSD/HW3 this summer in my Model 3. I'm excited for the promise that has been made to me, but I was also excited for the same promise when it was made to me 5 years ago, after Tesla's sudden split with Mobileye forced them to create their own hardware.

I love Elon, but I'm so tired of obviously bogus timelines. Being off by a year is forgivable. Being off by a couple of years is forgivable. Being off by 5 years, with a clientele of hundreds of thousands of customers, many of whom have bought and subsequently sold a car having paid for a feature they never received, is upsetting. And then you watch some of these "beta" videos and the car responds laughably badly. Not to mention it appears to be driving at a speed where, even if it did work, you'd probably want to take over control anyhow.

This isn't the first rodeo for some of us. What's different here is that Tesla has happily accepted money for FIVE YEARS for a feature that, at best, is going to be half baked, at least on this hardware iteration. If you can't solve 100% of the use cases, you don't have autonomy. So there goes the robo-taxi network for probably at least another 5-10 years, if ever.

Tesla has done a great job of muddying the waters by using both "FSD beta" and the term "Autosteer on City Streets". FSD beta isn't FSD. It's autosteer on city streets with driver intervention required. But the terms are interchanged enough that people assume what they're about to get is some FSD beta, when in reality they're getting a half-baked version of autosteer on city streets. So you're getting a partial version of a partial product. Meanwhile Tesla has your $10,000 and, oh, what's that? You want to transfer that purchase over to another car? Nah, we can't allow you to transfer your vaporware from one VIN to another. What a racket.

And for the apologists who want to scream "beta" as their excuse for everything: beta isn't beta. Beta is a term used to tell people that the product, as it exists, does not yet work the way it was originally intended to, but that it eventually will. But hey, we'll be nice and let you try it out while we iron out the kinks... never.

For those people I have two questions:

1. Look at the Autosteer setting on an AP1 car. What does it say next to the toggle switch? Beta. It says beta. A feature suite that was never completed as advertised and never will be. Six years later and their basic, original-hardware autosteer is still in beta. What a joke.

2. Go into a parking lot and summon your car. See how that works for you. Spoiler alert: it doesn't. If your car is facing grass without a curb, it will just pull up onto the grass. If another car even approaches its path, it panics and stops, resulting in every car around it stopping while they impatiently wait for the confused Tesla to figure out what it's doing. It won't.

So they've got a full self-driving suite that's basically complete, which would lead you to believe they should be able to push an update to existing cars to make Smart Summon not completely suck. The fact that they haven't done it means they either won't (unlikely) or can't (much more likely), and if it's the latter, then how close to full self driving can we possibly be?

So frustrated with this crap. I'm trying to buy a Model X right now and like an abused spouse I keep looking for FSD cars because I naively think it will eventually come despite knowing better.
 
Also, you keep using one video in one area (like downtown Chicago or Oakland) to make your point. Obviously no one is expecting fsd beta to be great in every location. It's about averages, not "let's find the worst area for FSD beta and use that to take a dump on it or someone".
Respectfully, it's not about averages. It's about edge cases, and if you can't solve for every edge case then you don't have full self driving. What you have is this new made-up nomenclature that Tesla calls Autosteer on City Streets, which most people seem to think is full self driving when in reality it's just another hands-on system, only one with exponentially more capability. Will it be cool in many situations? For sure. Will it be reliable all the time? No way, and Tesla's clientele has demonstrated for a solid five years now that they will gladly hand over money for a product they won't ever receive, and in many cases they'll come back and do it again. *points finger at self*