Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Profound progress towards FSD

I wonder why Tesla didn't release the software that was shown and demoed on Autonomy Day last year. Clearly it was "feature complete," with the ability to autosteer on city streets with lane changes and sharp turns. Elon Musk's most recent estimate of 2-4 months for the rewrite was also qualified with "Then it's a question of what functionality is proven safe enough to enable for owners." https://twitter.com/elonmusk/status/1278539278356791298

So perhaps Tesla knew the intersections near the Palo Alto HQ were "safe enough," while others even just down the street could be quite dangerous, e.g., turning into oncoming traffic. But even then, what's the threshold for releasing more widely? Theoretically, gathering failure examples from the fleet, even in shadow mode, should help improve training. Unless Karpathy realized that squeezing more functionality out of the 2.5D Autopilot stack would take more work for less benefit?

No, it doesn't mean the Palo Alto roads were safer. The demo was strictly a demo. It's too much of a liability for the company to do a wide release of Level 3-like features.
 
Green found a case where the nav directions are wrong:
"Tesla Nav at its finest. Yes it does tell me to first turn right from two left most lanes (and got intersection wrong) and then tells me to turn right when I need to turn left."
https://twitter.com/greentheonly/status/1290164321180241920

So would a Tesla robotaxi try to follow the bad nav directions or would it know the correct turn to make based only on vision? Because if a Tesla robotaxi tries to make a left instead of a right because the nav directions are wrong, that would be a problem.
 
So would a Tesla robotaxi try to follow the bad nav directions or would it know the correct turn to make based only on vision? Because if a Tesla robotaxi tries to make a left instead of a right because the nav directions are wrong, that would be a problem.

That looks like a graphical glitch, more than anything. Notice how the blue route line on the map is always correct, regardless of the directions.
 
Personally I have not experienced any "profound" progress towards FSD. What I've seen is some small incremental improvements over the last year after NoA was released.

First, it was the FSD computer that was supposed to change everything, then it was the autopilot rewrite, now it's some 4D perception BS.

I still bought FSD since I do enjoy the incremental improvements but expectations are way overblown by both Tesla themselves and the Tesla simps.

So much hype and all we got was a staged demo around Palo Alto.
 
Green found a case where the nav directions are wrong:
"Tesla Nav at its finest. Yes it does tell me to first turn right from two left most lanes (and got intersection wrong) and then tells me to turn right when I need to turn left."
https://twitter.com/greentheonly/status/1290164321180241920

So would a Tesla robotaxi try to follow the bad nav directions or would it know the correct turn to make based only on vision? Because if a Tesla robotaxi tries to make a left instead of a right because the nav directions are wrong, that would be a problem.

There's a freeway exit near me where I always make a right turn, and the nav always says "take the XYZ exit and make a sharp left" for whatever reason (there is no "sharp left" option, even if I was going left). Of course it corrects itself as soon as you pass the "sharp left" point... the route is always drawn correctly though. Seems like wonky nav instructions from some bad metadata or something like that.
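The "wonky instruction, correct route line" pattern fits a split between route geometry and turn metadata: the polyline is computed from the road graph, while the spoken maneuver comes from per-edge annotations that can be stale or wrong. A navigation client could sanity-check the instruction against the geometry itself. A minimal sketch (purely illustrative; none of this is Tesla's code) that derives the turn direction from the bearing change of the route polyline at the maneuver point:

```python
import math

def bearing(a, b):
    """Compass-style bearing in degrees from point a to b,
    with points as (east, north) in a local flat frame."""
    return math.degrees(math.atan2(b[0] - a[0], b[1] - a[1])) % 360.0

def turn_from_polyline(prev_pt, turn_pt, next_pt):
    """Classify the maneuver at turn_pt from the route geometry alone.
    A spoken instruction that disagrees with this is suspect, since the
    drawn route line is usually the reliable part."""
    delta = (bearing(turn_pt, next_pt) - bearing(prev_pt, turn_pt) + 540) % 360 - 180
    if delta > 30:
        return "right"
    if delta < -30:
        return "left"
    return "straight"
```

So if the nav metadata says "sharp left" but the polyline bends right at that node, the client could fall back on the geometry instead of the annotation.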
 
I gotta say that with update 2020.24.6.9, my car in FSD on a city street didn't notice a white semi with no painted signs on the truck or on the trailer that came out of a parking lot perpendicular to the traffic flow. I had to apply the brakes myself. The truck didn't show up on my center console. How annoying.
 
I gotta say that with update 2020.24.6.9, my car in FSD on a city street didn't notice a white semi with no painted signs on the truck or on the trailer that came out of a parking lot perpendicular to the traffic flow. I had to apply the brakes myself. The truck didn't show up on my center console. How annoying.

Why did you think FSD would handle this?
 
Why did you think FSD would handle this?
I thought FSD would handle it because there was a fatality that made the national news of just this kind of thing over a year ago, and recently, there was a Tesla that crashed into an overturned trailer in the left lane of a limited-access highway that got lots of views on YouTube. I thought the Tesla neural network was supposed to learn from experience. Apparently that's too much to expect.
 
Unfortunately, it's still unclear whether neural networks will ever be good enough for Level 4 or 5

I hope it's just a matter of the right set of sensors with a well-trained neural network. Because if we cannot use machine learning to achieve level 5, it would take years for programmers to code a deterministic alternative, and it would end up being such an unwieldy codebase to maintain and update.
 
That's not how neural networks work, and that's also not how Tesla approaches the FSD problem. Unfortunately, it's still unclear whether neural networks will ever be good enough for Level 4 or 5.
I'm not a computer guy. I'm just a very old guy. But if I were a computer programmer designing something that deserved the name "neural network," I'd design it so that when a particular picture appears (in this case, a solid object in the travel lane that's not moving with traffic) and the system notes people taking over or crashing, the network would learn from that and stop soon enough not to hit the truck.
 
I'm not a computer guy. I'm just a very old guy. But if I were a computer programmer designing something that deserved the name "neural network," I'd design it so that when a particular picture appears (in this case, a solid object in the travel lane that's not moving with traffic) and the system notes people taking over or crashing, the network would learn from that and stop soon enough not to hit the truck.
I understand that the neural network has to distinguish between a truck effectively stopped in the travel lane and a shadow across the road or a change in the color of the pavement. If the sensors can't distinguish between a three-dimensional object and a change in the color of the pavement, then it's not a neural network problem; it's a sensor problem. Maybe Elon is wrong. Maybe you do need LIDAR in order to have FSD.
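For what it's worth, "learn from the cases where people take over" is roughly how trigger-based fleet data collection is usually described: the car keeps a rolling buffer of sensor frames and uploads a clip whenever the human's input disagrees sharply with what the system commanded, so the clip can be labeled and folded into the next training run. A toy sketch of that trigger (all names and thresholds here are made up for illustration, not Tesla's):

```python
from dataclasses import dataclass
from collections import deque

@dataclass
class Frame:
    timestamp: float
    driver_brake: float   # 0..1 pedal input from the human
    planner_brake: float  # 0..1 brake the system itself commanded

class InterventionLogger:
    """Keep a rolling buffer of frames; flush it when the driver
    brakes much harder than the system planned (a disagreement)."""
    def __init__(self, window=100, threshold=0.4):
        self.buffer = deque(maxlen=window)
        self.threshold = threshold
        self.flushed = []  # stand-in for clips queued for upload

    def on_frame(self, frame: Frame):
        self.buffer.append(frame)
        # Disagreement: human demanded far more braking than the planner.
        if frame.driver_brake - frame.planner_brake > self.threshold:
            self.flushed.append(list(self.buffer))
            self.buffer.clear()
```

The learning itself still happens offline, in the next training run; the car doesn't rewrite its own network on the spot.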
 
Personally I have not experienced any "profound" progress towards FSD. What I've seen is some small incremental improvements over the last year after NoA was released.

First, it was the FSD computer that was supposed to change everything, then it was the autopilot rewrite, now it's some 4D perception BS.

I still bought FSD since I do enjoy the incremental improvements but expectations are way overblown by both Tesla themselves and the Tesla simps.

So much hype and all we got was a staged demo around Palo Alto.

This is pretty accurate. I’ll go along for the ride (pun intended), and didn’t mind paying for FSD. It’s fun to see what future updates may or may not bring. To me, FSD is just an option box to check when ordering a car. The 2020 Audi A6 Allroad I “built” had the $4,900 Bang & Olufsen stereo and the $1,800 all-wheel steering. Those seem like a real waste of money for what you get. At least AP/FSD does “something”, regardless of how much it may or may not suck. Not sympathizing, just the way I see things.
 
programmers to code a deterministic alternative

Human-maintained heuristics for FSD will be impossible to manage. We just don't have enough individual brainpower or organizational structures to manage that kind of complexity and constant change. It's essentially what Karpathy has been saying regarding HD maps and baked-in heuristics.
 
I hope it's just a matter of the right set of sensors with a well-trained neural network.

It is a matter of the right set of sensors and a well-trained NN. That's why companies like Waymo are using cameras, lidar, radar, and well-trained NNs. Now, I am not saying they have perfect FSD yet. They don't. But they do have accurate and reliable perception. They've certainly solved problems like a truck turning perpendicular in front of the car. And with accurate perception, they've been able to focus on more difficult FSD problems.
 
I understand that the neural network has to distinguish between a truck effectively stopped in the travel lane and a shadow across the road or a change in the color of the pavement. If the sensors can't distinguish between a three-dimensional object and a change in the color of the pavement, then it's not a neural network problem; it's a sensor problem. Maybe Elon is wrong. Maybe you do need LIDAR in order to have FSD.

I think this is a good example where lidar is very helpful, even essential. If there is a real object, the lasers from the lidar will reflect off of it, but if it's just a shadow, there won't be any reflections. So Lidar won't get confused between a real object and a shadow. Lidar will give you accurate object detection where camera vision might get confused.
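To make the shadow-vs-object distinction concrete: take the lidar returns that fall inside the suspicious image region and check their height. A shadow has no vertical extent, so its returns stay on the road surface, while a real obstacle produces returns well above it. A minimal illustrative check (not any vendor's API), assuming points in a local frame with the road near z = 0:

```python
def classify_patch(points, z_road=0.0, height_tol=0.15):
    """points: list of (x, y, z) lidar returns inside the patch, in metres.
    Returns 'object' if enough returns sit well above the road plane,
    otherwise 'shadow_or_road'."""
    raised = [p for p in points if p[2] - z_road > height_tol]
    # Require a handful of raised returns so one noisy point can't trigger.
    return "object" if len(raised) >= 3 else "shadow_or_road"
```

A dark patch of pavement and a stopped trailer look nothing alike to this test, which is the whole argument for lidar here.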
 
I think this is a good example where lidar is very helpful, even essential. If there is a real object, the lasers from the lidar will reflect off of it, but if it's just a shadow, there won't be any reflections. So Lidar won't get confused between a real object and a shadow. Lidar will give you accurate object detection where camera vision might get confused.
But there's also radar in the front of a Tesla. That should be able to distinguish a stopped object in the travel lane from a stopped object outside it. If it can't, then a higher-definition radar should be installed. This is a can't-do-without issue. Another thing a self-driving car can't do without is the ability to see deer and other animals in or near the travel lane. A human who sees a deer on the side of the road, or crossing the road, knows there may be other deer nearby and slows down. FSD needs to do the same. It has to realize that a deer standing perfectly still might suddenly jump in front of the car, or that a member of its family, which might not be visible, might suddenly jump in front of the car.
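One caveat about counting on radar for stopped objects: classic automotive radar is often described as discarding returns that are stationary in the world (range rate roughly equal and opposite to ego speed), because otherwise every sign, overpass, and parked car would trigger false braking. A toy sketch of that legacy filter, showing why a stopped truck in the lane can get dropped along with the clutter (all names and values are illustrative):

```python
def filter_stationary(detections, ego_speed, tol=0.5):
    """detections: list of dicts with 'range_rate' in m/s (closing < 0).
    A world-stationary object closes at exactly -ego_speed, so its
    world speed works out to ~0 and it gets dropped as clutter."""
    moving = []
    for d in detections:
        world_speed = d["range_rate"] + ego_speed  # ~0 if stationary
        if abs(world_speed) > tol:
            moving.append(d)  # kept: actually moving in the world
        # else: dropped as presumed roadside clutter -- including a
        # stopped truck sitting in the travel lane!
    return moving
```

Higher-resolution imaging radar is one proposed way out, since it can localize a stationary return well enough to tell "in my lane" from "overhead sign."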

Another FSD issue is potholes. Right now, I understand that FSD can't distinguish between an actual pothole and a patched pothole so it ignores potholes. FSD is going to have to be able to make that distinction.