powertoold
Training.
On the surface, yes, but it's way more difficult and complicated than that.
It looks like the NN extrapolated the double yellow line through the tree causing the car to go around it to correctly position itself in the lane.
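For what it's worth, the behavior described above is consistent with plain curve fitting: if the lane-line detector only emits points where the paint is actually visible, fitting a low-order polynomial to those points and evaluating it across the occluded stretch "extends" the line straight through the tree. A toy illustration of the idea (nothing here is Tesla's actual code; the points and the linear model are made up):

```python
import numpy as np

# Hypothetical detected double-yellow-line points:
# x = distance ahead in meters, y = lateral offset in meters.
# The gap between x=20 and x=35 is where a tree occludes the paint.
x_visible = np.array([0, 5, 10, 15, 20, 35, 40, 45])
y_visible = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 0.7, 0.8, 0.9])

# Fit a straight line to the visible points...
coeffs = np.polyfit(x_visible, y_visible, deg=1)

# ...then evaluate it inside the occluded gap to "extrapolate through the tree".
x_gap = np.array([25.0, 30.0])
y_gap = np.polyval(coeffs, x_gap)
print(y_gap)  # roughly [0.5, 0.6]
```

If the planner then positions the car relative to that extrapolated line, you get exactly the "swing around the tree to sit correctly in the lane" maneuver seen in the clip.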
On the surface, yes, but it's way more difficult and complicated than that.

Of course. Optimizing for rain and sun could be quite different and affect normal perception too. That’s where the Tesla secret-sauce design matters.
What about this wonky tree maneuver? Or the monorail? I think it's activated, but it's not very good or only maneuvers when confidence is high.
that's pretty gutsy to see what it does

He did say it was “insane.” Though I think he meant that a different way!
My guess is that they’re still getting too many false positives, so even if it’s active, they’re not using it to influence the car‘s decisions yet.

Apparently it is not easy, since they're not yet doing it.
When 99% of his viewers' comments are like this, is it any wonder why he lets it drive like an a$$hole?
Not sure if the car was stopping because it saw the fence itself, or because of the cones it saw along with the fence. It was impressive how the car seemed to see the overhanging bush and dodged it, though. Maybe the “pixel height” thing is active, but they’re filtering detections out due to false positives, which is why the car will sometimes happily drive at poles and the like?
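The filtering being guessed at here would be conceptually simple: only let a detection influence planning when the network's confidence clears a high threshold, so rare false positives don't brake or swerve the car — at the cost of also ignoring some real obstacles (poles, etc.). A toy sketch of that trade-off (labels, scores, and the threshold are all invented for illustration, not anything from Tesla):

```python
# Hypothetical per-frame detections: (label, confidence score in [0, 1]).
detections = [
    ("overhanging_branch", 0.92),
    ("pole", 0.41),             # a real obstacle, but scored low by the net...
    ("shadow_artifact", 0.30),  # ...and a false positive scored similarly low.
]

# Set high to suppress false positives.
CONFIDENCE_THRESHOLD = 0.85

# Only high-confidence detections are allowed to influence the planner;
# the low-scoring real pole gets filtered out along with the noise.
actionable = [d for d in detections if d[1] >= CONFIDENCE_THRESHOLD]
print(actionable)  # [('overhanging_branch', 0.92)]
```

That would match the observed behavior: confident detections (the bush) get dodged, while marginal ones (some poles) are driven at.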
Edit - and this is the stuff that I hated about Beta videos. Guy lets his car drive like an asshole and do things it shouldn’t, like the left turn at about 7:00. Much happier having Beta myself and knowing Beta is great when a driver is willing to intercede, instead of letting Beta eff up for the views.
Guy lets his car drive like an asshole and do things it shouldn’t, like the left turn at about 7:00.
That left turn at 7:00 didn’t look that bad. The car was only going at 10 mph. The cars were coming from the other direction at low speeds too, it looks like.
That left turn at 7:00 didn’t look that bad.

It’s not cool to cut in front of people.
... Guy lets his car drive like an asshole and do things it shouldn’t, like the left turn at about 7:00. ...

I found AIDrivr's comment surprising: "That is fine for bay area driving." No, I don't think so; that is an example of really bad driving. Although I did find that section of the video entertaining...
That left turn at 7:00 didn’t look that bad. The car was only going at 10 mph. The cars were coming from the other direction at low speeds too, it looks like.
Similarly, that unprotected left was unacceptable IMO, despite the fact that it'd make it "safely."

I think this is where it’s difficult to make out in the video. In real life, you can tell if the other driver is expecting you to turn or not. Not that FSD can make that out, but the driver can let it go or slam on the brakes based on that situational awareness.
PS: BTW, if it's a really busy road and there are people waiting behind you, they can get impatient and aggressive if you don't take such turns.
He did say it was “insane.” Though I think he meant that a different way!

If we get to see the results at all, the NHTSA investigation and information requests to Tesla + the other OEMs will probably be the best glimpse we get into the true limitations of the systems. Question 5.g. in the big letter is about the object and event detection and response capabilities within the operational design domain, and it specifically asks about limitations regarding various objects/scenarios.
It’s very odd seeing these people convince themselves the car is capable of things it is not actually capable of. It’s hard to tell from the guy in the video if he is being completely genuine, though.
The reality is that it is not well understood exactly what the Tesla can recognize - but we know for certain that there are many things it cannot recognize and respond to.
Guess I can use this opportunity to cross-post my (first and only?) FSD Beta video here: Wiki - MASTER THREAD: Actual FSD Beta downloads and experiences - starting on Friday, October 8th, 2021
As mentioned, use the bookmarks to skip around. Chapters are broken, whether it's due to mild profanity, lack of subscribers, or whatever.
I guess I could re-drive this section in future releases to see whether the neighborhood issues improve. I should re-drive it now, too, of course. Since it's like a box of chocolates.
Musk seems to think their latest 4D NNs can detect and avoid static objects, even without classifying them. AIDrivr's recent video shows his car swerving to avoid overhanging branches.
But if you dig enough, it's apparent there is no controversy around object detection in the autonomous vehicle space. Problems with static object detection have been known for years, they exist across all systems, and it's clearly not an easy nut to crack. The idea that "NN retraining" or something will fix this or that Tesla came up with some game-changer between December 2020 -- when they told the regulators about these limitations -- and now, is just silly IMO. Tesla knew 11 months ago what could potentially be on the horizon.
I don't know how many people would have seen it already, but this was a conference back in 2018, held in conjunction with MIT and focusing on autonomous vehicles. Static object detection limitations are discussed in general terms at 1:24:24.
I think the NHTSA investigation will show that the other systems share many of the same basic capabilities, but most of the other companies are being far more conservative with the maneuvers they'll allow their vehicles to attempt, in no small part to avoid the regulatory hammer coming down. Technical understanding of these systems is in extremely short supply and super specialized, and only the people involved in the research will really understand what's happening behind the scenes.