FSD Beta Videos (and questions for FSD Beta drivers)

I think they are overwhelmed by failure data at the (low) quality level FSD is at now.
A team of 50 testers, covering different cities, would be more than enough. Each day of testing would net them a few hundred issues per driver. Once drivers go weeks without reporting any failures, testing can be spread out further. But they are not there yet.
Maybe cost reduction is a reason for the public beta, but I think it is the ancillary one. Publicity, promises made, and keeping the innovation-hungry fans satisfied is the main reason.

This seems unlikely. There's no reason Tesla would give 60k testers the ability to "overwhelm" them with snapshot-button data if their main motivation were publicity or keeping promises. I would be more inclined to agree if, starting with the safety-score cohort of testers, they hadn't given us the ability to snapshot and had kept that feature only for employees and the original early-access cohort.
 
A lot of the disengagements are from apparently bad map data where the car thinks it is in the wrong lane when it is not. We also see the car not slowing down like it should or not yielding to cross traffic.

 
A lot of the disengagements
Do you mean disengagements initiated by the driver or the red steering wheel initiated by the program?
...wrong lane...
I am not sure it strictly adheres to lane mapping, because Autopilot/FSD/FSD beta have been able to drive in the WRONG lane just fine when re-routed by construction. My 2018 Autopilot drove north on I-5 in Los Angeles in the physically wrong southbound lane, which was legal because construction had temporarily re-routed one southbound lane for northbound traffic. Now my 2022 10.9 FSD beta does the same thing in the CA-99 construction in Bakersfield, legally driving in the wrong lane as directed by the construction.

That sounds more like Vision than pre-mapping.





...We also see the car not slowing down like it should or not yielding to cross traffic...

Tesla does not do pre-mapping because, in theory, Pure Vision is superhuman.
 
Do you mean disengagements initiated by the driver or the red steering wheel initiated by the program?

All the disengagements were initiated by the driver because the car did something he deemed wrong or unsafe.

I am not sure it strictly adheres to lane mapping, because Autopilot/FSD/FSD beta have been able to drive in the WRONG lane just fine when re-routed by construction. My 2018 Autopilot drove north on I-5 in Los Angeles in the physically wrong southbound lane, which was legal because construction had temporarily re-routed one southbound lane for northbound traffic. Now my FSD beta does the same thing in the CA-99 construction in Bakersfield, legally driving in the wrong lane as directed by the construction.

That sounds more like Vision than pre-mapping.

In the video, FSD Beta wants to make lane changes that don't make sense. For example, on a straight road with no traffic, it wants to move over into the left lane to "follow route" when there is no need to be in the left lane at all. Another time, FSD Beta inexplicably changes lanes into a left turn only lane when it needs to go straight. The driver in the video blames it on bad map data.
 
All the disengagements were initiated by the driver because the car did something he deemed wrong or unsafe.
Thanks! That's much clearer because other threads complain about the red steering wheel error.
In the video, FSD Beta wants to make lane changes that don't make sense. For example, on a straight road with no traffic, it wants to move over into the left lane to "follow route" when there is no need to be in the left lane at all. Another time, FSD Beta inexplicably changes lanes into a left turn only lane when it needs to go straight. The driver in the video blames it on bad map data.
That has been happening since day one of Navigate on Autopilot (NoA). FSD beta just carries it over.

For Navigation to do anything beyond driving straight within a lane, such as turning left or right, it depends on GPS data. So good luck with an exit that was moved by construction!

The importance of GPS data was also discussed in:


In the picture below, the car was actually on the freeway on the right-hand side of the screen, but the GPS data placed the red triangle car icon approaching the 3-way T-intersection on the city streets. The car was driving at 70 MPH but was erroneously and automatically slowed to 25 MPH, "Stopping for traffic control in 100 ft":

[attached screenshot of the FSD display]


In summary, Tesla uses not only vision but also GPS data. It does not adhere strictly to that GPS data, since it can drive in a wrong-direction lane (and some reports have it crossing the double yellow line toward opposing traffic). On the other hand, bad GPS data can mess up speed and stop-sign compliance (some reports of running stop signs), among other things.
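To make the failure mode above concrete, here is a toy sketch of naive map matching (made-up road data and a nearest-segment matcher, nothing like Tesla's actual stack): a modest GPS error snaps the car to the parallel city street, and the speed limit and traffic control follow the wrong road.

```python
import math

# Toy map: each road is (name, (x, y) of a reference point, speed_limit_mph).
# Coordinates are meters in some local frame -- purely illustrative.
ROADS = [
    ("freeway",     (0.0,  0.0), 70),
    ("city street", (0.0, 25.0), 25),  # runs parallel, ~25 m away
]

def match_road(pos):
    """Naive map matching: snap to the nearest road reference point."""
    return min(ROADS, key=lambda r: math.dist(pos, r[1]))

true_position = (0.0, 0.0)    # car is physically on the freeway
gps_position  = (0.0, 15.0)   # GPS fix drifted ~15 m toward the city street

name, _, limit = match_road(gps_position)
print(f"Matched road: {name}, applying speed limit {limit} MPH")
# -> Matched road: city street, applying speed limit 25 MPH
# A ~15 m fix error snaps the car to the wrong road, so the planner
# suddenly targets 25 MPH (and the mapped traffic control) at freeway speed.
```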
 
The goobers slamming him in the comments section need to check themselves.

But this technology having issues with static objects shouldn't be news to anyone who follows it closely. Another person in the comments says their vehicle clipped a construction barrier on Beta; who knows what else has happened that isn't on video.


So it is AI Addict himself who worked for Tesla, but he doesn't anymore.
 
But it is news. Radar was blamed, so it was removed and replaced with pure vision to prevent exactly this kind of scenario. Now radar is off and pure vision is here, so this is definitely news.
Issues with static object detection predate Vision-only, though; these problems are well known in the industry and have been for a long time. Whether Vision-only makes it worse, I don't know, but I understood this as an issue tied more to world modelling than anything else.
 
I think they are overwhelmed by failure data at the (low) quality level FSD is at now.
A team of 50 testers, covering different cities, would be more than enough.
LOL. So 50 people to get real-world data from 200 cities is "enough"? I'm so glad you are not heading the FSD team ;)

BTW, for anyone saying "too much data": I want to remind you that filtering is a lot easier than realistic data generation.
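To put a number on it: 50 testers at a few hundred issues per driver per day is roughly 15,000 snapshots a day, and triaging that is a cheap batch job. A toy sketch of such a filter (hypothetical snapshot fields and trigger tags, not Tesla's pipeline):

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical snapshot record -- a stand-in for whatever metadata
# rides along with a driver-initiated snapshot.
@dataclass(frozen=True)
class Snapshot:
    city: str
    trigger: str      # e.g. "wrong_lane", "missed_yield", "phantom_brake"
    disengaged: bool

def triage(snapshots, per_bucket=20):
    """Keep at most `per_bucket` examples per (city, trigger) bucket,
    preferring snapshots that ended in a driver disengagement."""
    seen = Counter()
    kept = []
    # Disengagements first: they are the strongest failure signal.
    for s in sorted(snapshots, key=lambda s: not s.disengaged):
        bucket = (s.city, s.trigger)
        if seen[bucket] < per_bucket:
            seen[bucket] += 1
            kept.append(s)
    return kept

# 50 testers * ~300 issues/day = ~15,000 raw snapshots to sift --
# a trivial batch job, unlike generating realistic failures from scratch.
raw = [Snapshot("Bakersfield", "wrong_lane", i % 7 == 0) for i in range(15_000)]
print(len(triage(raw)))   # -> 20 (only one bucket in this toy example)
```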
 
but it is news. The radar was blamed so it was taken off and replaced with pure vision to prevent such a scenario. Now, radar is off, pure vision is here, so this is definitely news.
Doubt radar would have made any difference, since the bollards are just lightweight plastic. The radar signal would be absorbed or pass through rather than reflect, unless the paint on them happens to be reflective at radar wavelengths.
 
Doubt radar would have made any difference, since the bollards are just lightweight plastic. The radar signal would be absorbed or pass through rather than reflect, unless the paint on them happens to be reflective at radar wavelengths.
It is really puzzling how vision would not be able to identify bollards. It is not like they blend in at all. So odd that they haven't trained for bollards ... but maybe it is the green color vs normal yellow ones?

[photo: green bollards in the roadway]
 
It is really puzzling how vision would not be able to identify bollards. It is not like they blend in at all. So odd that they haven't trained for bollards ... but maybe it is the green color vs normal yellow ones?

Keep in mind that our brains automatically render them as 3D objects (an effect so strong and automatic that we can be fooled by flat paintings on the road). Beta is probably labeling them sometimes as flat lines painted on the road and other times as red cones.

[two screenshots of the FSD visualization]
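If that is what's happening, the per-frame class can flicker between "painted line" and "cone". A toy sketch of that flicker and the kind of temporal smoothing that could stabilize it (invented labels, not Tesla's classes):

```python
from collections import Counter, deque

# Made-up per-frame classifications of one ambiguous object
# (a green in-road bollard), flipping between two plausible classes.
frames = ["road_marking", "cone", "cone", "road_marking", "cone",
          "cone", "road_marking", "cone", "cone", "cone"]

def smoothed_labels(labels, window=5):
    """Majority vote over a sliding window of recent per-frame labels."""
    recent = deque(maxlen=window)
    out = []
    for label in labels:
        recent.append(label)
        out.append(Counter(recent).most_common(1)[0][0])
    return out

print(list(zip(frames, smoothed_labels(frames))))
# The raw stream flips class every few frames (what the visualization
# shows as flat lines one moment, cones the next); the smoothed stream
# settles on "cone" after a few frames.
```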
 
Keep in mind that our brains automatically render them as 3D objects (an effect so strong and automatic that we can be fooled by flat paintings on the road). Beta is probably labeling them sometimes as flat lines painted on the road and other times as red cones.
It just is wildly strange to me, after watching every Andrej Karpathy and FSD/AP video and presentation, that bollards would be a problem; they seem like 101-level stuff compared to what those presentations demonstrate the system can pick up. I imagine they are setting up a few 'triggers' so these cases get auto-captured for the 'send to the mothership' upload and NN training that has been discussed (specifying narrow triggers).
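A 'narrow trigger' along those lines could be as simple as a predicate over recent perception output. A minimal sketch, with made-up conditions and thresholds (nothing from Tesla's actual trigger set):

```python
# A minimal sketch of a fleet "trigger": a predicate over a short history
# of perception output that, when true, queues a clip for upload.
# Class names, fields, and thresholds here are all made up for illustration.

def bollard_flicker_trigger(history, min_flips=3):
    """Fire when an object's predicted class flips repeatedly in a short
    window -- the kind of narrow, cheap test that could auto-capture the
    ambiguous bollard cases without uploading everything."""
    flips = sum(1 for prev, cur in zip(history, history[1:]) if prev != cur)
    return flips >= min_flips

recent_classes = ["cone", "road_marking", "cone", "road_marking", "cone"]
if bollard_flicker_trigger(recent_classes):
    print("queue clip for upload")  # -> fires: 4 flips in 5 frames
```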
 
It just is wildly strange to me, after watching every Andrej Karpathy and FSD/AP video and presentation, that bollards would be a problem; they seem like 101-level stuff compared to what those presentations demonstrate the system can pick up. I imagine they are setting up a few 'triggers' so these cases get auto-captured for the 'send to the mothership' upload and NN training that has been discussed (specifying narrow triggers).

In the other scene with the bollards, you can see them visualized as either cones or gray pylons (look carefully to the right and behind the car graphic). I wonder if it lost them in a blind spot after it got too close.

[screenshot of the FSD visualization with the bollards shown as cones/pylons behind the car]
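If the loss really is a blind-spot problem, the textbook mitigation is object permanence: keep a short memory of tracks that drop out of view instead of forgetting them instantly. A toy sketch under that assumption (invented structure, not Tesla's tracker):

```python
import time

class TrackMemory:
    """Toy object-permanence buffer: remember objects that left the
    cameras' view for a few seconds instead of dropping them instantly."""
    def __init__(self, ttl_s=3.0):
        self.ttl_s = ttl_s
        self.tracks = {}  # id -> (last_position, last_seen_timestamp)

    def update(self, visible):
        now = time.monotonic()
        for obj_id, pos in visible.items():
            self.tracks[obj_id] = (pos, now)
        # Drop only tracks unseen for longer than the time-to-live.
        self.tracks = {i: (p, t) for i, (p, t) in self.tracks.items()
                       if now - t <= self.ttl_s}

    def obstacles(self):
        """Everything the planner should still steer around,
        including objects currently in a blind spot."""
        return {i: p for i, (p, t) in self.tracks.items()}

mem = TrackMemory()
mem.update({"bollard_1": (2.0, 1.5), "bollard_2": (2.0, 3.0)})
mem.update({})              # bollards now too close to be in any camera view
print(mem.obstacles())      # both bollards are still treated as obstacles
```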
 
In the other scene with the bollards, you can see them visualized as either cones or gray pylons (look carefully to the right and behind the car graphic). I wonder if it lost them in a blind spot after it got too close.



Those in-road bollards are getting more common for protecting bike lanes and other things. What happens if the middle one gets wrecked? Do cars sneak through the gap?

Be glad you don't have in-road concrete like this. Admittedly the construction is not finished yet, but it's a bit of a hazard since the road is open like this. I wonder if FSD Beta would avoid the curb if the warning sign got tipped over?

[photo of the unfinished in-road concrete construction]