Wiki MASTER THREAD: Actual FSD Beta downloads and experiences

Interesting. I wonder what they mean by 'navigation/maps' as a disengagement reason? In my limited experience, 10.10.2 is about the same as 10.8, but I haven't been able to test much because of the weather. It does seem to be worse on unmarked roads.
I'd put these as navigation/maps
- Selecting wrong lanes for turning
- Approaching a roundabout fast because it is not mapped
- Trying to turn into roads that don't exist
- Very slow to start at intersections (because reality doesn't match the map)

School zones could also go under navigation/maps (basically a missing feature), though we should really have a separate category for that, covering things like
- Not able to handle multi-lane roundabouts
- Slowing for school zones
- Using the middle turn lane for unprotected left
 
Well, I guess that means you're off the prospective Robotaxi list. :)
Well, the “best” geofenced robotaxis don’t even handle rain, let alone ice, sleet, or snow.

BTW, when the road edges were covered with snow here, FSD just treated the edge of the piled-up snow as the new edge of drivable space. So it worked OK in some places and not at all in others. Of course, the bigger issue is obscured lines.

FSD can handle unmarked roads and marked roads. Partially marked roads, as happens with snow, would need a whole new set of training data.
 
Definitely some kind of interpretation is happening. Even if the camera is high enough to see some parts of the road, there is no way it can see all the lines, even the one right next to the barrier.

Yes. I think it was mentioned during AI Day that Tesla has trained a NN to extrapolate the shape of intersections and roads based on what vision is able to see. So vision is seeing the two lane lines and a divider, and is likely extrapolating that there must be two lane lines on the other side.

Interpolating road lines to fill in sporadic missing data seems reasonable. Extrapolating road lines and then making decisions off that information is risky. Imagination is not the quality I want in a self-driving car. No evidence has been given on whether it bases driving decisions on "made-up" data yet. Perhaps the uncertainty of extrapolated data is taken fully into account.
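To illustrate the distinction (a minimal Python sketch only; the arc-length positions, lateral offsets, and confidence numbers are made up and have nothing to do with Tesla's actual lane-line network): interpolation fills gaps between observed points, while extrapolation invents geometry beyond them and deserves a lower confidence.

```python
# Illustrative sketch only: the difference between filling gaps inside the
# observed span (interpolation) and inventing geometry beyond it
# (extrapolation). Nothing here reflects Tesla's actual lane-line network.
import numpy as np

def fill_lane_line(s_observed, y_observed, s_query):
    """Estimate the lateral offset of a lane line at arc-length positions s_query.

    Queries inside [min(s_observed), max(s_observed)] are interpolated;
    queries outside that span are extrapolated and flagged as low-confidence.
    """
    s_observed = np.asarray(s_observed, dtype=float)
    y_observed = np.asarray(y_observed, dtype=float)
    s_query = np.asarray(s_query, dtype=float)

    # Linear interpolation; np.interp clamps beyond the observed range,
    # which is itself a (crude) form of extrapolation.
    y_est = np.interp(s_query, s_observed, y_observed)

    # Mark which estimates are backed by observations on both sides.
    interpolated = (s_query >= s_observed.min()) & (s_query <= s_observed.max())
    confidence = np.where(interpolated, 0.9, 0.2)  # arbitrary illustrative values
    return y_est, confidence

if __name__ == "__main__":
    # Lane line seen from 5 m to 30 m ahead (with a gap); queried out to 60 m.
    s_obs = [5, 10, 25, 30]
    y_obs = [1.8, 1.7, 1.5, 1.4]
    s_q = np.arange(0, 65, 5)
    y, conf = fill_lane_line(s_obs, y_obs, s_q)
    for s, yy, c in zip(s_q, y, conf):
        tag = "interpolated" if c > 0.5 else "EXTRAPOLATED (low confidence)"
        print(f"s={s:5.1f} m  y={yy:4.2f} m  {tag}")
```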
 
Well, the “best” geofenced robotaxis don’t even handle rain, let alone ice, sleet, or snow.

BTW, when the road edges were covered with snow here, FSD just treated the edge of the piled-up snow as the new edge of drivable space. So it worked OK in some places and not at all in others. Of course, the bigger issue is obscured lines.

FSD can handle unmarked roads and marked roads. Partially marked roads, as happens with snow, would need a whole new set of training data.
Everyone focuses on the lack of marked lines when snow covers them. Frankly, I see that as something FSD can probably handle, but there are so many other snow- and black-ice-related variables that support for a snowy climate is pretty far away. Sometimes driving in the snow is an "art" more than a "science," and that is really hard for even AI to handle. Such a simple item as adjusting the route to avoid really hilly roads can be critical, like the 13% grade hill 100 meters from my house. Two weeks ago a new neighbor not used to driving in the snow slid down the hill and ran over a neighbor's mailbox. They now know to circle the block and come up the hill to make the turn halfway up. And I won't even go into the covered cameras with wet snow and/or ice.
 
Imagination is not the quality I want in a self-driving car.

I think imagination is actually an asset of a safe human driver, and it's something that's going to be difficult but necessary to replicate if we want to achieve level 5 autonomy.

A safe driver is constantly aware not only of their immediate surroundings, but also of things they can't see and of imagined future possibilities. Handling blind corners or high hill crests, and anticipating the behavior of aggressive drivers, all rely on context-specific intuition and imagination.
 
I think imagination is actually an asset of a safe human driver, and it's something that's going to be difficult but necessary to replicate if we want to achieve level 5 autonomy.

A safe driver is constantly aware not only of their immediate surroundings, but also of things they can't see and of imagined future possibilities. Handling blind corners or high hill crests, and anticipating the behavior of aggressive drivers, all rely on context-specific intuition and imagination.
Good point. Implementation is the challenge.

Ask someone who just crashed and you might hear "I thought it was safe because I didn't see headlights," "There's hardly ever any cross-traffic here," or "I figured the truck would let me merge because we're all nice around here." Lots of decisions get made based on a lack of data. Lack of data and faulty assumptions account for at least some proportion of crashes.

If we're aiming for fewer crashes, we need adequate data and, ideally, few or no assumptions, or at least well-reasoned ones.

My point relates to FSD's x-ray vision through concrete barriers, drawing lane lines on a road it has invented entirely (the road may exist, but FSD doesn't KNOW it exists). Unless it has actual map data or partial vision, in which case there may be some justification, it could just be guessing that because it sees cars just above the level of the concrete barrier there must be a road; a reasonable guess, but still a guess. Does the road disappear when there are no cars, as it should? Showing something it has invented just because "there's probably a road there" presents risk. As with all things, if it's done well then the risk is managed.
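To make that concrete, here's a toy Python sketch of what I mean (purely hypothetical; the lane names, confidence numbers, and threshold are invented and have nothing to do with how FSD actually weighs its perception output): a planner that refuses to commit to a maneuver when any lane it depends on is merely extrapolated rather than observed or map-confirmed.

```python
# Toy illustration only: treat geometry the system has merely guessed
# differently from geometry it has observed or has on a map. Purely
# hypothetical; not how FSD actually weighs its perception output.
from dataclasses import dataclass
from enum import Enum

class Source(Enum):
    OBSERVED = "observed"          # directly seen by a camera
    MAP = "map"                    # confirmed by map data
    EXTRAPOLATED = "extrapolated"  # invented to fill in what can't be seen

@dataclass
class LaneEstimate:
    lane_id: str
    source: Source
    confidence: float

def may_commit_to_maneuver(required_lanes, min_confidence=0.8):
    """Allow a maneuver only if every lane it depends on is observed or mapped
    with enough confidence; lanes that are only extrapolated block it."""
    for lane in required_lanes:
        if lane.source is Source.EXTRAPOLATED:
            return False, f"{lane.lane_id}: geometry is guessed, not known"
        if lane.confidence < min_confidence:
            return False, f"{lane.lane_id}: confidence {lane.confidence:.2f} too low"
    return True, "ok"

if __name__ == "__main__":
    lanes = [
        LaneEstimate("ego_lane", Source.OBSERVED, 0.95),
        LaneEstimate("far_side_of_barrier", Source.EXTRAPOLATED, 0.40),
    ]
    ok, reason = may_commit_to_maneuver(lanes)
    print(ok, "-", reason)  # False - far_side_of_barrier: geometry is guessed, not known
```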
 
Everyone focuses on the lack of marked lines when snow covers them. Frankly, I see that as something FSD can probably handle, but there are so many other snow- and black-ice-related variables that support for a snowy climate is pretty far away. Sometimes driving in the snow is an "art" more than a "science," and that is really hard for even AI to handle. Such a simple item as adjusting the route to avoid really hilly roads can be critical, like the 13% grade hill 100 meters from my house. Two weeks ago a new neighbor not used to driving in the snow slid down the hill and ran over a neighbor's mailbox. They now know to circle the block and come up the hill to make the turn halfway up. And I won't even go into the covered cameras with wet snow and/or ice.

It also has no concept of slowing down for snow/ice (beyond what it does when it can't detect lane lines on an unmarked road). And jerky motions do not inspire confidence when traction is already limited.
 
It also has no concept of slowing down for snow/ice (beyond what it does when it can't detect lane lines on an unmarked road). And jerky motions do not inspire confidence when traction is already limited.
That's my experience as well. Part of winter driving is assessing the road conditions and adjusting your driving style. FSD, AP and TACC can't and don't do this. Realistically, I've never even used cruise control when I'm concerned about driving conditions so I have no expectation of even using TACC, much less AP or FSD in such conditions.
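For a sense of scale (rough textbook friction coefficients, not measurements, and not anything FSD actually computes), idealized braking distance scales with 1/μ, so the same 35 mph needs several times the stopping distance on snow or ice:

```python
# Rough back-of-the-envelope numbers (approximate textbook friction
# coefficients, not measurements, and not anything FSD actually computes):
# braking distance scales with 1/mu, which is why speed has to come down
# so much on snow and ice.
G = 9.81            # m/s^2
MPH_TO_MS = 0.44704
M_TO_FT = 3.28084

FRICTION = {        # approximate tire-road friction coefficients
    "dry asphalt": 0.8,
    "wet asphalt": 0.5,
    "packed snow": 0.25,
    "ice": 0.1,
}

def braking_distance_ft(speed_mph: float, mu: float) -> float:
    """Idealized braking distance d = v^2 / (2 * mu * g), reported in feet."""
    v = speed_mph * MPH_TO_MS
    return v * v / (2.0 * mu * G) * M_TO_FT

if __name__ == "__main__":
    for surface, mu in FRICTION.items():
        d = braking_distance_ft(35.0, mu)
        print(f"35 mph on {surface:12s} (mu={mu:.2f}): ~{d:4.0f} ft to stop")
```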
 
That's my experience as well. Part of winter driving is assessing the road conditions and adjusting your driving style. FSD, AP and TACC can't and don't do this. Realistically, I've never even used cruise control when I'm concerned about driving conditions so I have no expectation of even using TACC, much less AP or FSD in such conditions.

Forget winter. It doesn't slow down for hairpin curves on good pavement. I've got some around here that drop from 40/45 to 15. There has been a change in FSD: it used to not slow down at all, and now it does, but in the middle of the curve, which isn't the greatest idea traction-wise. Maybe the accelerometers force it now. Plus, lately I'll dial down the max ahead of time and still have to disengage, because it either doesn't slow to the lower max until it has gone a long way, or does so very slowly.

I know it doesn't read yellow advisory signs, but just reading the map should tell it something. And what's with not slowing down when I dial down the max? It does do that when the posted speed limit changes. Not all that well, IMHO, but probably not a ticket-getter.
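For what it's worth, the "reading the map" math isn't exotic: if the map geometry yields a curve radius, a comfortable entry speed falls straight out of v = sqrt(a_lat * R). A quick sketch (the radii and the 2 m/s² lateral-acceleration limit are illustrative guesses, not values Tesla publishes):

```python
# Rough sketch: a curve radius taken from map geometry implies a comfortable
# entry speed via v = sqrt(a_lat * R). The radii and the lateral-acceleration
# limit below are illustrative guesses, not anything Tesla publishes.
import math

def curve_speed_mph(radius_m: float, a_lat_max: float = 2.0) -> float:
    """Max speed (mph) that keeps lateral acceleration below a_lat_max (m/s^2)."""
    return math.sqrt(a_lat_max * radius_m) * 2.237

if __name__ == "__main__":
    # A tight hairpin (~20 m radius) vs. progressively gentler curves.
    for r in (20, 60, 150):
        print(f"radius {r:4d} m -> comfortable entry speed ~{curve_speed_mph(r):4.0f} mph")
```

A roughly 20 m hairpin comes out around 14-15 mph with those assumptions, which lines up with the yellow advisory speeds on curves like the ones I described.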
 
Forget winter. It doesn't slow down for hairpin curves on good pavement. I've got some around here that drop from 40/45 to 15. There has been a change in FSD: it used to not slow down at all, and now it does, but in the middle of the curve, which isn't the greatest idea traction-wise. Maybe the accelerometers force it now. Plus, lately I'll dial down the max ahead of time and still have to disengage, because it either doesn't slow to the lower max until it has gone a long way, or does so very slowly.

I know it doesn't read yellow advisory signs, but just reading the map should tell it something. And what's with not slowing down when I dial down the max? It does do that when the posted speed limit changes. Not all that well, IMHO, but probably not a ticket-getter.
What version are you running? Mine used to wait until it was on the curve, but with the last couple of updates it's been better about preemptively slowing down.
 
Yes. I think it was mentioned during AI Day that Tesla has trained a NN to extrapolate the shape of intersections and roads based on what vision is able to see. So vision is seeing the two lane lines and a divider, and is likely extrapolating that there must be two lane lines on the other side.
While I can't say it doesn't extrapolate, I'm fairly confident it doesn't extrapolate without at least some visibility on the other side of the barrier, based on my experience driving near a similar barrier pretty much daily. I say this because it doesn't consistently draw lanes on the other side for me, and this seems to depend to some degree on the elevation of the lane I'm in. Keep in mind that those cameras are a bit higher than most drivers' eyes, so seeing some of the road over the barrier isn't necessarily as unlikely as one might imagine.
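A quick similar-triangles check of that camera-height point (the heights and distances below are made-up examples, not actual Tesla specs): raising the viewpoint even slightly shortens, by a lot, how far past the barrier the road surface becomes visible.

```python
# Similar-triangles sketch of the "camera sits higher than the driver's eyes"
# point. The heights and distances are made-up examples, not Tesla specs.
def first_visible_ground_m(camera_h: float, barrier_h: float, barrier_dist: float) -> float:
    """Distance from the camera at which the sight line grazing the barrier top
    reaches the road surface (all heights in meters above that surface)."""
    if camera_h <= barrier_h:
        return float("inf")  # can't see over the barrier at all
    return camera_h * barrier_dist / (camera_h - barrier_h)

if __name__ == "__main__":
    barrier_dist = 6.0   # barrier roughly 6 m from the viewpoint
    barrier_h = 0.9      # ~3 ft concrete barrier
    for label, h in (("typical eye height", 1.2), ("roof-mounted camera", 1.5)):
        d = first_visible_ground_m(h, barrier_h, barrier_dist)
        print(f"{label:20s} ({h:.1f} m): road surface visible beyond ~{d:.0f} m")
```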
 
A dangerous behavior discovered on 10.10.2; be careful. This morning, on a curvy, hilly, narrow two-lane residential road on garbage pickup day, my Model Y tried to go around the garbage truck. However, once it passed the truck, it did not go back to its lane and played chicken with the oncoming car. WTF! And it did it again with the second garbage truck!
 
A dangerous behavior discovered on 10.10.2; be careful. This morning, on a curvy, hilly, narrow two-lane residential road on garbage pickup day, my Model Y tried to go around the garbage truck. However, once it passed the truck, it did not go back to its lane and played chicken with the oncoming car. WTF! And it did it again with the second garbage truck!
Yes - I and others have noticed that 10.2 has a tendency to wander to the wrong side of unmarked roads.
 
A dangerous behavior discovered on 10.10.2; be careful. This morning, on a curvy, hilly, narrow two-lane residential road on garbage pickup day, my Model Y tried to go around the garbage truck. However, once it passed the truck, it did not go back to its lane and played chicken with the oncoming car. WTF! And it did it again with the second garbage truck!
You mean it kept going on the left even with oncoming traffic?