Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Opinion: what I think Tesla should do to "solve" FSD

First, I want to acknowledge all the hard work of the Tesla FSD team. Tesla has spent years building a sophisticated vision-only system, and the perception part is very advanced. I am not saying that Tesla Vision is perfect; there are still gaps in the perception system. But I feel Tesla has built a pretty good foundation for FSD. I am not suggesting Tesla start from scratch. On the contrary, I think Tesla should continue to build on that vision-only foundation.

But here are 3 things that I think Tesla should do in order to deploy a more reliable and more robust FSD system.

TL;DR: Tesla should copy Mobileye.

1) Crowdsourced maps
Tesla has a big fleet of vehicles on the road. It could leverage the vision system in every car to crowdsource detailed maps, similar to what Mobileye is doing. With such a large fleet, Tesla could map large areas quickly, probably every road in the US in a relatively short time. The same fleet could also keep the maps fresh, since there would almost always be a Tesla somewhere re-checking them. A lot of the errors that FSD Beta makes seem to be due to poor map data, and crowdsourcing could really help solve those issues, since a Tesla would likely be re-validating each spot pretty regularly. Detailed maps could also make FSD more robust: only the first car would need to drive a road mapless; every car that encounters the road later would have the benefit of the map as a prior. And detailed maps can provide useful non-visual information, like slowing down for a bend in the road that you can't see because of obstacles, or a preferred traffic speed that differs from the posted limit.
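To make the idea concrete, here is a toy sketch of fleet-report fusion (my own illustration, not Tesla's or Mobileye's actual pipeline; the class name, segment IDs, and the `min_reports` threshold are all made up):

```python
from dataclasses import dataclass, field
from statistics import median

@dataclass
class CrowdMap:
    # segment_id -> list of comfortable speeds (m/s) reported by passing cars
    reports: dict = field(default_factory=dict)

    def report(self, segment_id: str, speed: float) -> None:
        """A car that just drove the segment uploads the speed it held."""
        self.reports.setdefault(segment_id, []).append(speed)

    def speed_prior(self, segment_id: str, min_reports: int = 3):
        """Return a speed prior once enough cars have confirmed the
        segment; None means the next car drives it mapless."""
        obs = self.reports.get(segment_id, [])
        if len(obs) < min_reports:
            return None  # only the first few cars go in blind
        return median(obs)  # median is robust to a few outlier reports

fleet_map = CrowdMap()
for v in (23.0, 24.0, 25.5, 40.0):  # four cars pass; one outlier report
    fleet_map.report("blind-bend-17", v)
print(fleet_map.speed_prior("blind-bend-17"))  # -> 24.75
```

Timestamping each report and aging out stale ones would give the map-freshness property described above; the sketch omits that for brevity.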

2) Driving Policy
Tesla has done a lot of work on perception, but one area where FSD Beta is very weak, IMO, is driving policy. For example, FSD Beta is poor at knowing when to change lanes in dense traffic so that it doesn't miss an exit; it can wait too long and then lose its chance to merge. It can be overly cautious at intersections when there is no traffic at all, too timid when pulling out from a stop sign, or too aggressive on unprotected left turns. These are issues a better driving policy would help with: it would improve the car's driving decisions and make for a safer, smoother ride. Mobileye has a well-defined safety policy in RSS (Responsibility-Sensitive Safety) that helps the car drive safely. So I think Tesla needs to focus more on driving policy; FSD Beta would benefit greatly from it.
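For a flavor of what an explicit policy rule looks like, here is a sketch of RSS's minimum safe longitudinal gap, using the formula Mobileye's researchers published; the parameter values below are my own illustrative assumptions, not anyone's production numbers:

```python
def rss_min_gap(v_rear: float, v_front: float,
                rho: float = 0.5,      # rear car's response time (s), assumed
                a_accel: float = 3.0,  # max accel during response (m/s^2)
                b_min: float = 4.0,    # rear car's gentlest assured braking
                b_max: float = 8.0) -> float:
    """Minimum gap (m) so the rear car can always stop in time, even if
    it accelerates for rho seconds before braking gently while the
    front car brakes as hard as physically possible."""
    v_resp = v_rear + rho * a_accel               # rear speed after responding
    gap = (v_rear * rho + 0.5 * a_accel * rho**2  # distance covered responding
           + v_resp**2 / (2 * b_min)              # rear car stopping distance
           - v_front**2 / (2 * b_max))            # front car stopping distance
    return max(0.0, gap)

# Following a car at highway speed (25 m/s each) needs a healthy gap:
print(round(rss_min_gap(25.0, 25.0), 1))  # -> 61.6
```

A policy layer like this gives the planner a hard, explainable constraint (never let the gap shrink below `rss_min_gap`), which is exactly the kind of decision-making discipline the merging and unprotected-turn scenarios above are missing.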

3) Sensor redundancy
I think Tesla is smart to focus on vision-only as the foundation for perception, and I think vision-only will work great for L2 "door to door". What I am proposing is that Tesla continue with vision-only for L2, but also work on a lidar-radar subsystem that could be layered on top of the existing vision-only FSD system for extra reliability and redundancy, which could help get the system to "eyes off". This is essentially what Mobileye is doing, and I think it is smart. Vision-only is fine for L2, but having radar and lidar as a back-up is crucial for "eyes off", because you really need to be able to trust the system to be super reliable in all conditions, and vision-only cannot guarantee that. With vision-only, if the cameras fail, the entire system fails or has to pull over. With cameras, radar, and lidar, the system is far less likely to fail just because the cameras do. I think having extra sensors as back-up will really help reach that extra reliability.
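The redundancy argument can be illustrated with a toy arbitration rule (purely my own sketch, nothing like a shipping stack): keep a valid estimate as long as at least two independent channels survive, so a camera fault degrades the system instead of killing it.

```python
def fused_range(camera=None, radar=None, lidar=None):
    """Each argument is a range-to-lead-car estimate in meters, or None
    if that channel is faulted or blinded. Returns (estimate, eyes_off_ok):
    the median of the live channels, and whether enough independent
    channels remain to justify 'eyes off' operation."""
    live = sorted(r for r in (camera, radar, lidar) if r is not None)
    if not live:
        return None, False                # total sensing loss: pull over
    return live[len(live) // 2], len(live) >= 2  # median vote

# All three healthy: the outlier camera reading is outvoted.
print(fused_range(camera=55.0, radar=40.0, lidar=41.0))  # -> (41.0, True)
# Cameras blinded by sun glare: radar + lidar keep the car eyes-off capable.
print(fused_range(camera=None, radar=40.0, lidar=41.0))  # -> (41.0, True)
```

With vision-only, the same camera fault collapses to the single-channel case, which is exactly the "fail or pull over" outcome described above.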

[Image: "Full Self Driving Tesla" by rulenumberone2, licensed under CC BY 2.0.]
 
I doubt Tesla will go back to radar/lidar. Give all new camera-only Model Y owners 1-2 years of free FSD, with the caveat that they have to opt in to sending data back to Tesla.



This should hopefully provide Tesla enough data to improve FSD drastically.


Tesla can increase the number of years of free FSD that camera-only Model Y owners get if more data is needed. The catch is that owners have to opt in to data sharing to get free FSD.
 
With good enough computer vision you certainly ought not to need anything else. This is still theoretical, though. As it stands now, FSD is not even in the ballpark of production grade, and it doesn't appear to have improved significantly in quite literally years. I do not have it, but I base this on things I've read and seen in YouTube clips. Tesla saying it will be out of beta this year is not aspirational; it's simply lying.

It seems to me there is something fundamentally missing in their approach, whether it's their AI or hardware that is simply not capable (the cameras or the compute power isn't there).
 
I doubt Tesla will go back to radar/lidar. Give all new camera-only Model Y owners 1-2 years of free FSD, with the caveat that they have to opt in to sending data back to Tesla.



This should hopefully provide Tesla enough data to improve FSD drastically.


Tesla can increase the number of years of free FSD that camera-only Model Y owners get if more data is needed. The catch is that owners have to opt in to data sharing to get free FSD.

There is no indication that Tesla lacks data to train the neural nets. If anything, they already have more data than they can effectively work with.
 
FSD is not allowed here in the Netherlands. I have it on my Model S (FSD was standard on my Model S), but I can't fully use it because of government rules here; only Autopilot with navigation and lane change. Inside the city it can't take sharp street corners, roundabouts, etc. But it does stupid things on the highway too.

It brakes on and off rapidly if a warning sign is flashing at the edge of (or above) the road, signs like "traffic lights in about 1000 meters". Or buses with high taillights: my Autopilot thinks it's a red traffic light. Very dangerous for the traffic behind you when you're driving around 60 mph, and the Tesla starts braking every time the light flashes.

Some easy highway curves Autopilot can't take; I get the message "take over the steering wheel". Autopilot wants to change lanes to the left while the exit is on the right.

It also doesn't slow down on sharper highway curves with "Navigate on Autopilot". The Tesla "knows" where it is driving, right? It's driving on navigation. When you approach a sharp turn on the highway, slow down automatically, please. Even my 2015 BMW does this on cruise control: when there are places on the navigation where it has to slow down, it does so automatically.

There are a lot of "easy" problems that could be fixed before talking about autonomous driving; it's far from perfect these days. I would like to see these "easy" problems fixed before talking about complete "FSD". But for sure, I like the idea for the future. With every update I hope these stupid "little" problems are fixed, but to this day they are not.
 
Tesla has done a lot of work on perception, but one area where FSD Beta is very weak, IMO, is driving policy

Totally agree, at least in my experience driving policy often falls short in cases where perception is just fine.

Isn't the V12 "end to end" update they're working on going to address this by replacing most or all of the existing driving-policy code? Seems like Tesla agrees with your assessment on this.
 
For what purpose do you think the new Tesla HD radar modules on the S/X were installed?
I do not keep up with Tesla news. I only found out radar/lidar was removed when researching my purchase of the Model Y. It would be great if lidar/radar came back.


My research indicated it would not be coming back for the Model Y within the next 6 months to a year.


I am not impressed with Tesla Vision in the short time I've had it. I would like lidar/radar. I've driven other Model Ys from before the removal of the hardware for lidar/radar, and it was a much better experience.
 
2) Driving Policy
Tesla has done a lot of work on perception, but one area where FSD Beta is very weak, IMO, is driving policy. For example, FSD Beta is poor at knowing when to change lanes in dense traffic so that it doesn't miss an exit; it can wait too long and then lose its chance to merge. It can be overly cautious at intersections when there is no traffic at all, too timid when pulling out from a stop sign, or too aggressive on unprotected left turns. These are issues a better driving policy would help with: it would improve the car's driving decisions and make for a safer, smoother ride. Mobileye has a well-defined safety policy in RSS (Responsibility-Sensitive Safety) that helps the car drive safely. So I think Tesla needs to focus more on driving policy; FSD Beta would benefit greatly from it.
I think it is literally just this. Lane planning is at least 70% of what's wrong with FSD beta, and possibly as much as 90%.

The rest of the problems I've seen, with only a few exceptions, involve incorrect lane perception: it freaks out and beeps about a stopped vehicle parked off the left side of the road because it doesn't understand that the road curving away means the vehicle isn't actually in your driving path, and similar nonsense. This is stuff that should work every time, yet it fails alarmingly often, particularly on roads with no lane lines.


With good enough computer vision you certainly ought not to need anything else. This is still theoretical, though. As it stands now, FSD is not even in the ballpark of production grade, and it doesn't appear to have improved significantly in quite literally years. I do not have it, but I base this on things I've read and seen in YouTube clips.
Three years ago, the FSD Beta feature didn't even exist publicly, and highway driving was ridiculously primitive compared with what FSD Beta does now, so that's a bit of an exaggeration. At best, it hasn't improved significantly in maybe a year and a half; before that, the early betas supposedly improved quite a lot.

And honestly, even that would be an exaggeration. The single-stack change significantly reduced navigation mistakes and safety risks when moving between highway and city navigation, and the fact that they moved highway driving over to the same stack without making highway driving dramatically worse is actually quite remarkable. I think most of us cynically expected that it wouldn't reach parity with the legacy stack for six months to a year after release. 😁


Tesla saying it will be out of beta this year is not aspirational; it’s simply lying.

It does seem... incredibly optimistic, but I guess that depends on what you mean by "out of beta". If by "out of beta", you mean declaring that people don't have to opt in, and it is "just on" by default, then yeah, that can absolutely happen by the end of the year. If by "out of beta", you mean level 5 autonomy, that's laughable. My guess is that they mean the former, though, not the latter. I doubt the hands-on-wheel nag will go away before 2025.
 
I'd like to see the proponents of radar try driving using a radar screen. I can drive fine using only the visible light spectrum; an AI will eventually be able to do the same. Too much data to sift through is a bad thing, although infrared vision would be great to have at times.
I don't need an AI to drive like me. I want it to drive better. It should have all the data and sensors it needs so that it doesn't get blinded by the sun like I do, has proper depth perception, better visibility in the rain, etc. Justifying vision-only driving by comparing it to a human is silly, in my opinion. The goal is the best AI driver, and the approach should not be dictated solely by how humans drive.
 
It does seem... incredibly optimistic, but I guess that depends on what you mean by "out of beta". If by "out of beta", you mean declaring that people don't have to opt in, and it is "just on" by default, then yeah, that can absolutely happen by the end of the year. If by "out of beta", you mean level 5 autonomy, that's laughable. My guess is that they mean the former, though, not the latter. I doubt the hands-on-wheel nag will go away before 2025.
Musk last month:

"In terms of where Tesla is at this stage, I think we are very close to achieving full self-driving without human supervision. This is only speculation, but I think we'll achieve full self-driving, maybe what you would call four or five, I think later this year."

They are not close at all. There's no way in the world that's happening this year, and Musk must know this--I'm sure the people on his team know it.

My earlier comment about the multiple years here is that in the fall of 2016 Musk said this:

“Our goal is, and I feel pretty good about this goal, that we’ll be able to do a demonstration drive of full autonomy all the way from LA to New York, from home in LA to let’s say dropping you off in Time Square in New York, and then having the car go park itself, by the end of next year,” he said on a press call today. “Without the need for a single touch, including the charger.”

That was six years ago. Are they actually closer? This is why I wonder if there is something systemically wrong with their approach. Level 4 or 5 is the holy grail of tech for our generation (I really believe this), and is probably much harder than they ever imagined it would be.
 
Musk last month:

"In terms of where Tesla is at this stage, I think we are very close to achieving full self-driving without human supervision. This is only speculation, but I think we'll achieve full self-driving, maybe what you would call four or five, I think later this year."

They are not close at all. There's no way in the world that's happening this year, and Musk must know this--I'm sure the people on his team know it.

My earlier comment about the multiple years here is that in the fall of 2016 Musk said this:

“Our goal is, and I feel pretty good about this goal, that we’ll be able to do a demonstration drive of full autonomy all the way from LA to New York, from home in LA to let’s say dropping you off in Time Square in New York, and then having the car go park itself, by the end of next year,” he said on a press call today. “Without the need for a single touch, including the charger.”

That was six years ago. Are they actually closer? This is why I wonder if there is something systemically wrong with their approach. Level 4 or 5 is the holy grail of tech for our generation (I really believe this), and is probably much harder than they ever imagined it would be.
Last year Elon predicted FSD by end of year. Same this year.
They just did a new rewrite of the software. I think they're lost.
 
Musk last month:

"In terms of where Tesla is at this stage, I think we are very close to achieving full self-driving without human supervision. This is only speculation, but I think we'll achieve full self-driving, maybe what you would call four or five, I think later this year."

They are not close at all. There's no way in the world that's happening this year, and Musk must know this--I'm sure the people on his team know it.

My earlier comment about the multiple years here is that in the fall of 2016 Musk said this:

“Our goal is, and I feel pretty good about this goal, that we’ll be able to do a demonstration drive of full autonomy all the way from LA to New York, from home in LA to let’s say dropping you off in Time Square in New York, and then having the car go park itself, by the end of next year,” he said on a press call today. “Without the need for a single touch, including the charger.”

That was six years ago. Are they actually closer? This is why I wonder if there is something systemically wrong with their approach. Level 4 or 5 is the holy grail of tech for our generation (I really believe this), and is probably much harder than they ever imagined it would be.
The difference, from what I can tell, is "imitation learning" (just throwing vast amounts of driving data at an AI) vs. "world modeling" (building a generalized idea of how the world works). The latter is what is required to deal with novel and/or unexpected situations.

Everything is getting better over time due to advances in research and hardware, so that's changing and will continue to improve. But training purely on driving data might be a dead-end approach, if you think about it. The step function of inputs and training improvements required to build a generalized world model is just going to be ... different.

 
Musk last month:

"In terms of where Tesla is at this stage, I think we are very close to achieving full self-driving without human supervision. This is only speculation, but I think we'll achieve full self-driving, maybe what you would call four or five, I think later this year."

They are not close at all. There's no way in the world that's happening this year, and Musk must know this--I'm sure the people on his team know it.

My earlier comment about the multiple years here is that in the fall of 2016 Musk said this:

“Our goal is, and I feel pretty good about this goal, that we’ll be able to do a demonstration drive of full autonomy all the way from LA to New York, from home in LA to let’s say dropping you off in Time Square in New York, and then having the car go park itself, by the end of next year,” he said on a press call today. “Without the need for a single touch, including the charger.”

That was six years ago. Are they actually closer? This is why I wonder if there is something systemically wrong with their approach. Level 4 or 5 is the holy grail of tech for our generation (I really believe this), and is probably much harder than they ever imagined it would be.
They tried one approach, and when they realized it wasn't going to work, he started bluffing his way through while they rewrote it.
And of course he doesn't know whether it will work.
It's similar to the Model X production debacle, but without knowing whether it's even a solvable problem.

The only thing Tesla has going for it at this point is that nobody knows how close anybody is. If it were a solved problem, companies would be scaling rapidly.