FSD Beta Videos (and questions for FSD Beta drivers)

On the surface, yes, but it's way more difficult and complicated than that.
Of course. Optimizing for rain and sun could be quite different and affect normal perception too. That’s where the Tesla secret sauce design matters.

Low-occlusion items are very tough to distinguish from partially erased road lines. This is probably the issue with a single pole sticking out of the road, too.
 
I noticed FSD Beta's BEVnet still doesn't handle "Bott's Dots" lane markers very well, which is odd because I thought California also used them all over the place.

[screenshot: FSD visualization of the Bott's Dots lane markers]
I.e., the yellow markers are supposed to read as a solid double line and the white line as dashed white. But on the visualization it rapidly flashes between solid and dashed. The old Autopilot lanes network had the same issue. It's not a big deal, but it affects the legality of when you're able to pass.
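
For what it's worth, that flashing looks like a plain debouncing problem: the per-frame classifier is noisy and nothing downstream applies hysteresis before drawing. Here's a toy sketch of the kind of vote-based smoothing that would stop the flicker (purely illustrative; the class, labels, and thresholds are my inventions, not anything from Tesla's pipeline):

```python
from collections import deque

class LaneTypeDebouncer:
    """Hysteresis filter: the displayed lane type only flips after a
    different label has dominated the recent frames, instead of
    tracking every frame's raw classifier output."""

    def __init__(self, window=15, flip_share=0.8):
        self.votes = deque(maxlen=window)  # last N per-frame labels
        self.current = None                # lane type we actually display
        self.flip_share = flip_share       # vote share needed to switch

    def update(self, frame_label):
        self.votes.append(frame_label)
        if self.current is None:
            self.current = frame_label
        for label in set(self.votes):
            if label != self.current:
                if self.votes.count(label) / len(self.votes) >= self.flip_share:
                    self.current = label
        return self.current

deb = LaneTypeDebouncer()
# Mostly "solid_double" with periodic "dashed" glitches, like the Bott's Dots case:
stream = ["solid_double", "dashed", "solid_double", "solid_double"] * 10
print({deb.update(s) for s in stream})  # {'solid_double'} - glitches never flip it
```

The cost is latency: with a 15-frame window, a genuine solid-to-dashed transition shows up roughly half a second late, which is fine for a display but would need care anywhere it feeds control.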
 
That's pretty gutsy, letting it run just to see what it does.
He did say it was “insane.” Though I think he meant that a different way!

It’s very odd seeing these people convince themselves the car is capable of things it is not actually capable of. It’s hard to tell from the guy in the video if he is being completely genuine, though.

The reality is that it is not well understood exactly what the Tesla can recognize - but we know for certain that there are many things it cannot recognize and respond to.

Guess I can use this opportunity to cross-post my (first and only?) FSD Beta video here: Wiki - MASTER THREAD: Actual FSD Beta downloads and experiences - starting on Friday, October 8th, 2021

As mentioned, use the bookmarks to skip around. Chapters are broken, whether it's due to mild profanity, lack of subscribers, or whatever.

I guess I could re-drive this section in future releases to see whether the neighborhood issues improve. I should re-drive it now, too, of course. Since it's like a box of chocolates.
 
When you watch Green's video of the driver monitoring numbers, it's interesting how the NN output continually fluctuates wildly even when given a static picture. That's how I see the path tentacle behaving as well - the car wavers between multiple paths.

Not understanding neural nets very well, I still wonder why the output is not more consistent, or at least smoothed more. I'd like to see the path make a stable decision and stick to it, or at least not fluctuate on a fractional-second basis.
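
Frame-to-frame smoothing doesn't require understanding the net at all; you can damp the output after the fact. A minimal sketch using an exponential moving average, assuming the planner emits an (N, 2) array of waypoints each frame (synthetic data here, nothing from the actual stack):

```python
import numpy as np

def smooth_path(prev_path, new_path, alpha=0.2):
    """Exponential moving average over predicted path waypoints.
    alpha weights the newest frame; smaller = steadier but laggier."""
    if prev_path is None:
        return new_path
    return alpha * new_path + (1 - alpha) * prev_path

# Simulate a jittery per-frame path prediction: 20 waypoints straight
# ahead, with 0.5 m of noise added each frame.
rng = np.random.default_rng(0)
true_path = np.stack([np.arange(20.0), np.zeros(20)], axis=1)  # (x, y) pairs
smoothed = None
for _ in range(30):  # 30 frames
    noisy = true_path + rng.normal(0.0, 0.5, true_path.shape)
    smoothed = smooth_path(smoothed, noisy)

print(np.abs(smoothed[:, 1]).mean())  # lateral jitter well under the raw 0.5 m
```

The tradeoff is lag: a small alpha gives a steady tentacle but reacts late to a genuine path change, which is probably why a control stack wouldn't smooth this aggressively even if the display could.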

 

Not sure if the car was stopping because it saw the fence, or because of the cones it saw alongside the fence. It was impressive how the car seemed to see the overhanging bush and dodged it, though. Maybe the “pixel height” thing is active, but they’re filtering stuff out due to false positives, which is why the car will sometimes happily drive at poles, etc.? (A sketch of that tradeoff is below.)

Edit - and this is the stuff that I hated about Beta videos. The guy lets his car drive like an asshole and do things it shouldn’t, like the left turn at about 7:00. Much happier having Beta myself and knowing Beta is great when a driver is willing to intercede, instead of letting Beta eff up for the views.
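
On the pixel-height theory: any detector that gates raw outputs on confidence or apparent size is trading phantom braking against missed thin obstacles, and the miss case looks exactly like happily driving at a pole. A toy illustration of that gate (field names and thresholds are made up for the example, not Tesla's):

```python
def keep_detection(det, min_conf=0.7, min_pixel_height=12):
    """Gate raw detections before they reach planning.

    Raising either threshold suppresses phantom-braking false positives,
    but it also silently drops real-but-marginal obstacles (thin poles,
    low fences) - which from the outside looks like driving right at them.
    """
    return det["confidence"] >= min_conf and det["pixel_height"] >= min_pixel_height

detections = [
    {"label": "pole",    "confidence": 0.55, "pixel_height": 9},   # real, dropped
    {"label": "fence",   "confidence": 0.80, "pixel_height": 30},  # real, kept
    {"label": "phantom", "confidence": 0.60, "pixel_height": 8},   # noise, dropped
]
print([d["label"] for d in detections if keep_detection(d)])  # ['fence']
```

Loosen the thresholds and you brake for shadows; tighten them and thin static stuff falls out of the world model. That tension is the whole game.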
 

The guy lets his car drive like an asshole and do things it shouldn’t, like the left turn at about 7:00.
When 99% of his viewers' comments are like this, is it any wonder he lets it drive like an a$$hole?

coke: "I think the aggression is completely acceptable because FSD probably knows exactly how fast every car it sees is going and computes where they'll be in x time, so it knows it can make certain turns. great vid"
 
The guy lets his car drive like an asshole and do things it shouldn’t, like the left turn at about 7:00.

It's just a matter of time before there's an accident caused by something like this. There's little evidence that many of these guys are monitoring the vehicle closely enough to intervene when it makes a mistake. All it would take is a Tesla gunning it from behind the van & Rav4 instead of a Prius.

With something like 2k public testers now, I'd be shocked if that doesn't happen before the end of this year. Of course we might not hear about it.

I definitely don't want this to happen, and it would be entirely the fault of the driver - not really the fault of FSD Beta, which is not capable of avoiding accidents, since it is not driving the vehicle.
 
That left turn at 7:00 didn’t look that bad. The car was only going about 10 mph, and the cars coming from the other direction looked like they were at low speeds too.

My 10.2 likes to make unprotected rights when the oncoming traffic is close and sorta slow. I know it can make it, especially with the boost, but it's not "natural" because people don't expect cars to gun it.

Similarly, that unprotected left was unacceptable IMO, despite the fact that it'd make it "safely."
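
The difference between "it'd make it" and feeling natural is basically the size of the comfort margin in a gap-acceptance check. Back-of-envelope version (constant-velocity assumption; every number here is mine, not anything from the actual planner):

```python
def gap_is_acceptable(dist_m, closing_speed_mps, turn_time_s=4.0, margin_s=2.0):
    """Constant-velocity gap check for an unprotected turn.

    Accept only if the oncoming car arrives later than our turn time plus
    a comfort margin; margin_s is what separates "it'd make it" from
    driving the way other people expect.
    """
    if closing_speed_mps <= 0:
        return True  # not actually closing on us
    return dist_m / closing_speed_mps >= turn_time_s + margin_s

print(gap_is_acceptable(50.0, 13.4))   # ~30 mph car 50 m out: False, wait
print(gap_is_acceptable(120.0, 13.4))  # same car 120 m out: True, go
```

Human drivers effectively run this with a big margin_s; a planner tuned with a small one produces exactly the "technically safe but nobody expects it" turns being described.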
 
Similarly, that unprotected left was unacceptable IMO, despite the fact that it'd make it "safely."
I think this is where it's difficult to make out in the video. In real life, you can tell whether the other driver is expecting you to turn or not. Not that FSD can make that out - but the driver can let it go or slam on the brakes based on that situational awareness.

PS: BTW, if it's a really busy road and there are people waiting behind you, they can get impatient and aggressive if you don't take such turns.
 
In real life, you can tell whether the other driver is expecting you to turn or not. [...] If it's a really busy road and there are people waiting behind you, they can get impatient and aggressive if you don't take such turns.

I kinda see what you mean. Basically there are situations in slower traffic where you need to just turn left and expect the other driver to stop because it'd be impractical to wait forever.
 
The reality is that it is not well understood exactly what the Tesla can recognize - but we know for certain that there are many things it cannot recognize and respond to.
If we get to see the results at all, the NHTSA investigation and information requests to Tesla + the other OEMs will probably be the best glimpse we get into the true limitations of the systems. Question 5.g. in the big letter is about the object and event detection and response capabilities within the operational design domain and specifically asks about limitations regarding various objects/scenarios.

But if you dig enough, it's apparent there is no controversy around object detection in the autonomous vehicle space. Problems with static object detection have been known for years, they exist across all systems, and it's clearly not an easy nut to crack. The idea that "NN retraining" or something will fix this or that Tesla came up with some game-changer between December 2020 -- when they told the regulators about these limitations -- and now, is just silly IMO. Tesla knew 11 months ago what could potentially be on the horizon.

I don't know how many people have seen it already, but this was a conference back in 2018 held in conjunction with MIT and focusing on autonomous vehicles. Static object detection limitations are discussed in general terms at 1:24:24.


I think the NHTSA investigation will show that the other systems share many of the same basic capabilities, but that most of the other companies are being far more conservative with the maneuvers they'll allow their vehicles to attempt, in no small part to avoid the regulatory hammer coming down. Technical understanding of these systems is in extremely short supply and super specialized, and only the people involved in the research will really understand what's happening behind the scenes.
 
Problems with static object detection have been known for years, they exist across all systems, and it's clearly not an easy nut to crack.
Musk seems to think their latest 4D NNs can detect and avoid static objects, even without classifying them. AIDrivr's recent video shows his car swerving to avoid overhanging branches.
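
Detecting without classifying is a coherent idea: if the geometry says something protrudes from the drivable surface, you avoid it whether or not you can name it. A crude occupancy-style sketch of the concept (my own toy representation, assuming 3D points from some depth source; nothing to do with Tesla's actual networks):

```python
import numpy as np

def generic_obstacles(points, ground_z=0.0, min_height=0.3, cell=0.5):
    """Class-agnostic obstacle cells from a 3D point cloud.

    points: (N, 3) xyz points, e.g. from multi-camera depth estimation.
    Any grid cell containing a point more than min_height above the
    ground plane blocks driving - no classification label required.
    """
    occupied = set()
    for x, y, z in points:
        if z - ground_z >= min_height:
            occupied.add((int(x // cell), int(y // cell)))
    return occupied

# A branch hanging at ~1.5 m blocks its cell even though nothing names it
# "branch"; the low point at z=0.05 (road surface noise) is ignored.
cloud = np.array([[5.2, 0.1, 1.5], [5.3, 0.2, 1.6], [8.0, 3.0, 0.05]])
print(generic_obstacles(cloud))  # {(10, 0)}
```

The catch is that a purely geometric gate inherits the same threshold problem from above: set min_height low enough to catch curbs and you start braking for noise in the depth estimate.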