Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Opinion: what I think Tesla should do to "solve" FSD

First, I want to acknowledge all the hard work of the Tesla FSD team. Tesla has spent years building a sophisticated vision-only system, and the perception part is very advanced. I am not saying that Tesla Vision is perfect; there are still gaps in the perception system. But I feel Tesla has built a pretty good foundation for FSD. I am not suggesting Tesla start from scratch. On the contrary, I think Tesla should continue to build on that vision-only foundation.

But here are three things that I think Tesla should do to deploy a more reliable and more robust FSD system.

TL;DR: Tesla should copy Mobileye.

1) Crowdsourced maps
Tesla has a big fleet of vehicles on the road, and it could leverage the vision system in every car to crowdsource detailed maps, similar to what Mobileye is doing. With such a large fleet, Tesla could map large areas quickly, probably every road in the US in a relatively short time, and could keep the maps up to date easily, since there would almost always be a Tesla somewhere re-checking them. A lot of the errors FSD Beta makes seem to be due to poor map data, and crowdsourcing could really help solve those issues, since a Tesla would likely be checking any given spot pretty regularly.

Detailed maps could also make FSD more robust. With crowdsourcing, only the first car would need to drive a road mapless; every car that encounters the road later would have the benefit of the map as a prior. And detailed maps can provide useful non-visual info, like slowing down for a bend in the road that is hidden by obstacles, or a preferred traffic speed that differs from the posted speed limit.
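To make the crowdsourcing idea concrete, here is a toy sketch (the class, method names, and segment IDs are all my own invention, not anything Tesla or Mobileye has published) of how per-segment observations from many cars could be averaged into a shared prior, including the "preferred traffic speed" case mentioned above:

```python
from collections import defaultdict
from statistics import mean

class CrowdMap:
    """Toy backend that aggregates per-segment reports from passing cars."""

    def __init__(self):
        # segment_id -> list of traffic speeds (mph) observed by passing cars
        self.reports = defaultdict(list)

    def report(self, segment_id, observed_speed):
        # A passing car uploads the speed traffic actually drove here.
        self.reports[segment_id].append(observed_speed)

    def prior(self, segment_id):
        # Later cars query the averaged estimate as a map prior.
        # Returns None for unmapped segments (the "first car" case).
        obs = self.reports.get(segment_id)
        return mean(obs) if obs else None

m = CrowdMap()
print(m.prior("US-50:mile-12"))  # None: the first car drives mapless
m.report("US-50:mile-12", 38)    # posted limit might be 45, but traffic
m.report("US-50:mile-12", 42)    # actually slows down around the bend
print(m.prior("US-50:mile-12"))  # later cars inherit the averaged prior
```

A real system would aggregate lane geometry, signs, and stop-line positions, not just speeds, but the pattern is the same: the fleet reports, the backend averages, later cars consume the prior.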

2) Driving Policy
Tesla has done a lot of work on perception, but one area where FSD Beta is very weak, IMO, is driving policy. For example, FSD Beta is poor at knowing when to change lanes in dense traffic to avoid missing an exit; it can wait too long and then lose its chance to merge. It can be overly cautious at intersections when there is no traffic at all, too timid pulling away from a stop sign, or too aggressive making unprotected left turns. These are all issues that better driving policy would help with. It would improve the car's driving decisions and make for a safer, smoother ride. Mobileye has a well-developed driving policy built around RSS (Responsibility-Sensitive Safety) that helps the car drive safely. So I think Tesla needs to focus more on driving policy. I think FSD Beta would benefit greatly from it.
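For reference, the core rule of Mobileye's published RSS model is a formula for the minimum safe longitudinal gap behind a lead car. A sketch of that formula in code; the reaction time and acceleration/braking limits below are illustrative placeholders, not Mobileye's calibrated numbers:

```python
def rss_min_gap(v_rear, v_front, rho=1.0, a_accel=3.0, b_min=4.0, b_max=8.0):
    """Minimum safe longitudinal gap (m) per the published RSS rule.

    v_rear, v_front: speeds of the following and lead car (m/s)
    rho:     reaction time of the following car (s)        -- placeholder value
    a_accel: max acceleration it might apply during rho    -- placeholder value
    b_min:   gentlest braking it is assumed to use after   -- placeholder value
    b_max:   hardest braking the lead car might apply      -- placeholder value
    """
    d = (v_rear * rho
         + 0.5 * a_accel * rho ** 2
         + (v_rear + rho * a_accel) ** 2 / (2 * b_min)
         - v_front ** 2 / (2 * b_max))
    return max(d, 0.0)  # the rule is clamped at zero

# Following a car at 25 m/s (~56 mph) that is doing the same speed:
print(rss_min_gap(25.0, 25.0))
# The demanded gap grows sharply if the lead car is stopped:
print(rss_min_gap(25.0, 0.0))
```

RSS is a safety envelope rather than a full policy, but it shows the kind of explicit, checkable rule that a driving-policy layer can enforce on top of the planner.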

3) Sensor redundancy
I think Tesla is smart to focus on vision-only as the foundation for perception, and I think vision-only will work great for L2 "door to door". What I am proposing is that Tesla continue with vision-only for L2 but also work on a lidar-radar subsystem that could be added on top of the existing vision-only FSD system to provide extra reliability and redundancy, helping get the system to "eyes off". This is essentially what Mobileye is doing, and I think it is smart. Vision-only is fine for L2, but having radar and lidar as a back-up is crucial for "eyes off", because to go "eyes off" you really need to be able to trust the system to be super reliable in all conditions. Vision-only cannot provide that: if the cameras fail, the entire system fails or needs to pull over. With cameras, radar and lidar, the system is less likely to fail when the cameras fail. I think having extra sensors as back-up will really help reach that extra reliability.

[Image: "Full Self Driving Tesla" by rulenumberone2, licensed under CC BY 2.0]
 
I don't agree with the need for radar.

While it may be useful for redundancy, when all systems work, how can we safely combine the data?
For example, a low-res radar w/Kalman filter can easily erase things like protruding metal bars from concrete paving. Is it safe to put equal weights between vision and radar data?
Radar noise also dramatically increases in the rain, and I'm not even talking about other weather phenomena like wet snow (rarely observed in Israel) rendering all kinds of *dars useless until cleaned.
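For what it's worth, a Kalman-style fusion does not put equal weights on vision and radar: each measurement is weighted by the inverse of its (modeled) noise variance, so a noisy sensor is automatically down-weighted. A minimal 1-D sketch, with invented noise figures:

```python
def fuse(vision_range, vision_var, radar_range, radar_var):
    """Inverse-variance weighted fusion of two range measurements (m).

    Each sensor's weight comes from its modeled noise variance, so
    nothing forces equal weights between vision and radar data.
    """
    w_vision = 1.0 / vision_var
    w_radar = 1.0 / radar_var
    fused = (w_vision * vision_range + w_radar * radar_range) / (w_vision + w_radar)
    fused_var = 1.0 / (w_vision + w_radar)
    return fused, fused_var

# Vision is confident (var 0.25 m^2); a low-res radar is noisy (var 4.0 m^2).
# The fused estimate stays close to vision, and its variance is smaller
# than either sensor's alone:
est, var = fuse(vision_range=50.0, vision_var=0.25, radar_range=47.0, radar_var=4.0)
print(est, var)
```

Of course, this only helps if the variance models are honest; the "protruding metal bars" failure happens when radar's variance is modeled as better than it really is.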
 
I think people are looking at radar as a backup for vision. The way I see it, vision runs everything; however, when vision sees something it thinks requires a deceleration, it queries radar to confirm. People hope this will reduce the number of phantom braking events, such as those triggered by mirages.
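A toy sketch of this "vision decides, radar confirms" gating idea; the function, threshold, and return values are hypothetical, not Tesla's actual logic:

```python
def braking_decision(vision_sees_obstacle, radar_closing_speed, confirm_threshold=1.0):
    """Gate hard braking on radar confirmation of a vision detection.

    radar_closing_speed: closing speed (m/s) toward a radar return ahead,
    0.0 if radar sees nothing. confirm_threshold is a made-up tuning knob.
    """
    if not vision_sees_obstacle:
        return "none"                # vision drives; radar alone never brakes
    if radar_closing_speed >= confirm_threshold:
        return "brake"               # both sensors agree: treat it as real
    return "coast"                   # vision-only alarm: likely mirage/shadow

print(braking_decision(True, 6.5))   # radar confirms the obstacle
print(braking_decision(True, 0.0))   # no radar return: suppress the phantom brake
```

The trade-off is baked into the `coast` branch: gating on radar suppresses false positives, but also delays reaction to anything radar genuinely cannot see.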
 
While it may be useful for redundancy, when all systems work, how can we safely combine the data?

Combining the data, called sensor fusion, is solved at this point. AV companies use cameras, radar and lidar and have figured out how to fuse all that data.

For example, a low-res radar w/Kalman filter can easily erase things like protruding metal bars from concrete paving. Is it safe to put equal weights between vision and radar data?

You cannot use low-res radar; it does not work well. You need high-definition radar, which is what AV companies use, and it works great.

Radar noise also dramatically increases in the rain, and I'm not even talking about other weather phenomena like wet snow (rarely observed in Israel) rendering all kinds of *dars useless until cleaned.

No, this is false. Radar noise does not dramatically increase in rain or snow. Quite the opposite. Radar is not affected by rain, snow or fog because it operates at a different wavelength than visible light. Good radar will see through rain, snow and fog without any loss. That's the whole point of using radar because it can see through rain, snow and fog when cameras will be degraded.

It's cameras that suffer greater noise in rain, snow and fog, as the visible light is scattered by the water in the air. We've all experienced the loss of visibility when driving in rain, snow or fog.

And good AVs will have self-cleaning sensors to prevent noise from dirty or occluded sensors.
 
Combining the data, called sensor fusion, is solved at this point. AV companies use cameras, radar and lidar and have figured out how to fuse all that data.



You cannot use low-res radar; it does not work well. You need high-definition radar, which is what AV companies use, and it works great.



No, this is false. Radar noise does not dramatically increase in rain or snow. Quite the opposite. Radar is not affected by rain, snow or fog because it operates at a different wavelength than visible light. Good radar will see through rain, snow and fog without any loss. That's the whole point of using radar because it can see through rain, snow and fog when cameras can't.

It's cameras that suffer greater noise in rain, snow and fog, as the visible light is scattered by the water in the air. We've all experienced the loss of visibility when driving in rain, snow or fog.

And good AVs will have self-cleaning sensors to prevent noise from dirty or occluded sensors.
Weather radar shows rain and snow. It doesn't pass through.
 
Weather radar shows rain and snow. It doesn't pass through.

That's a different type of radar. There are different kinds of radar that operate at different wavelengths. The radar used in cars is different from the radar used in weather detection. Weather radar operates at a wavelength that reflects off rain and snow in order to detect it. Radar used in cars operates at a different wavelength that passes through rain and snow, since cars need to be able to see through it.
 
The radar backup seems to me to be the same trolley problem. A predefined confidence threshold, or a regression over some ambient factors, would be needed to confirm via radar. This may involve significant engineering effort, and additional hardware. Phantom braking seems like an okay compromise versus tens of thousands of dollars in additional cost.
 
I think people are looking at radar as a backup for vision. The way I see it, vision runs everything; however, when vision sees something it thinks requires a deceleration, it queries radar to confirm. People hope this will reduce the number of phantom braking events, such as those triggered by mirages.
I personally would like to see this being implemented, assuming the car has enough bandwidth to handle this quickly enough.

On the other hand, if the reaction speed becomes even slower than now (i.e. later braking, etc), then probably not.
 
The radar backup seems to me to be the same trolley problem. A predefined confidence threshold, or a regression over some ambient factors, would be needed to confirm via radar. This may involve significant engineering effort, and additional hardware. Phantom braking seems like an okay compromise versus tens of thousands of dollars in additional cost.

Lots of AV companies use cameras, radar and lidar and have no issues with sensor fusion. Adding radar makes their perception much more reliable and reduces phantom braking.

Phantom braking is not an acceptable compromise when it can mean causing an accident. And adding radar does not cost tens of thousands of dollars. Not sure where you are getting that number from.
 
I admit I'm ignorant of high-def radar and lidar when it comes to weather, so I'm asking for some clarification, @diplomat33

I accept your premise that modern radar and lidar can see through rain, snow, and fog where cameras cannot. However, how does that help AVs that need to see traffic lights, street signs, and lane markings? If the fog is dense enough that cameras cannot see more than a few dozen feet, can the car keep driving with radar and lidar? My understanding of how systems operate today would indicate that the car cannot continue driving in those conditions for fear of breaking the law (running a red light, failing to observe a speed limit sign, failing to observe a detour or temporary construction zone sign, etc.).

Radar is not affected by rain, snow or fog because it operates at a different wavelength than visible light. Good radar will see through rain, snow and fog without any loss. That's the whole point of using radar because it can see through rain, snow and fog when cameras can't.
 
That's a different type of radar. There are different kinds of radar that operate at different wavelengths. The radar used in cars is different from the radar used in weather detection. Weather radar operates at a wavelength that reflects off rain and snow in order to detect it. Radar used in cars operates at a different wavelength that passes through rain and snow, since cars need to be able to see through it.
Volvo says rain and snow block radar and can make ACC unreliable.
 
I accept your premise that modern radar and lidar can see through rain, snow, and fog where cameras cannot.

I said radar can, I did not say lidar can. Lidar will have some degradation in rain, snow and fog.

However, how does that help AVs that need to see traffic lights, street signs, and lane markings? If the fog is dense enough that cameras cannot see more than a few dozen feet, can the car keep driving with radar and lidar?
My understanding of how systems operate today would indicate that the car cannot continue driving in those conditions for fear of breaking the law (running a red light, failing to observe a speed limit sign, failing to observe a detour or temporary construction zone sign, etc.).

If cameras cannot see more than a few dozen feet, then visibility is so low that the AV should pull over. Obviously, visibility can be so bad that no sensor will work and you need to pull over. We are talking about conditions that are still drivable with cameras but where radar will help reduce failures.

As I explained before, the purpose of radar and lidar is not to drive instead of cameras but to reduce failure rates. You always need cameras.

 
I said radar can, I did not say lidar can. Lidar will have some degradation in rain, snow and fog.




If cameras cannot see more than a few dozen feet, then visibility is so low that the AV should pull over. Obviously, visibility can be so bad that no sensor will work and you need to pull over. We are talking about conditions that are still drivable with cameras but where radar will help reduce failures.

As I explained before, the purpose of radar and lidar is not to drive instead of cameras but to reduce failure rates. You always need cameras.
Thanks for clarifying - your comment was "That's the whole point of using radar because it can see through rain, snow and fog when cameras can't.", which I think was confusing for some, because if the cameras can't see then it can't safely drive.
 
Thanks for clarifying - your comment was "That's the whole point of using radar because it can see through rain, snow and fog when cameras can't.", which I think was confusing for some, because if the cameras can't see then it can't safely drive.

That was a poor choice of words on my part. I meant that cameras will be degraded in rain, snow and fog where radar will not. I edited my comment to clarify.
 
Combining the data, called sensor fusion, is solved at this point. AV companies use cameras, radar and lidar and have figured out how to fuse all that data.



You cannot use low-res radar; it does not work well. You need high-definition radar, which is what AV companies use, and it works great.



No, this is false. Radar noise does not dramatically increase in rain or snow. Quite the opposite. Radar is not affected by rain, snow or fog because it operates at a different wavelength than visible light. Good radar will see through rain, snow and fog without any loss. That's the whole point of using radar because it can see through rain, snow and fog when cameras can't.

It's cameras that suffer greater noise in rain, snow and fog, as the visible light is scattered by the water in the air. We've all experienced the loss of visibility when driving in rain, snow or fog.

And good AVs will have self-cleaning sensors to prevent noise from dirty or occluded sensors.
I do think some AV companies have reached higher levels of autonomy than Tesla, especially in good-weather areas.
But does anyone have an affordable solution?
Should autonomous cars cost twice as much as a good helicopter?
 
But does anyone have an affordable solution?

Nobody has affordable L4 yet. But Mobileye thinks they will. They say their Chauffeur product will cost less than $6,000 by 2025.




Should autonomous cars cost twice as much as a good helicopter?

You are probably thinking of Waymo robotaxis. Waymo is focused on robotaxis, which don't have to be affordable for a consumer to buy. Other companies like Mobileye are focused on consumer cars and are trying to develop self-driving for them that will be affordable.

The bottom line is that nobody has a system that is affordable, truly self-driving (driverless), and safe. Some companies like Waymo have made great progress on the driverless and safe parts, but their systems are not affordable for a consumer to buy. Others like Tesla have the affordable part, but not true full self-driving yet.
 
I don't agree with the need for radar.

While it may be useful for redundancy, when all systems work, how can we safely combine the data?
For example, a low-res radar w/Kalman filter can easily erase things like protruding metal bars from concrete paving. Is it safe to put equal weights between vision and radar data?
Radar noise also dramatically increases in the rain, and I'm not even talking about other weather phenomena like wet snow (rarely observed in Israel) rendering all kinds of *dars useless until cleaned.
This is running on an operating system (I hope) that can run multiple tasks and issue interrupts when events occur.
 
They tried one approach, and since realizing it wasn't going to work, he's been bluffing his way through as they've rewritten it. And of course he doesn't know whether it will work. It's similar to the Model X production debacle, but without knowing whether the problem is even solvable.

The only thing Tesla has going for it at this point is that nobody knows how close anybody is. If it were a solved problem, companies would be scaling rapidly.
A solvable problem is key (i.e., is it achievable now-ish on typical roads that are not hyper-curated).

If you took an average 16-year-old, gave them two days of instruction, told them to drive around NYC until they had a crash, and had FSD do the same, the kid would last longer. Obviously I cannot confirm this, but it underlines the problem with "millions of miles" of training data. It also underlines how frustrating it must be to work on this: driving is something a child can be taught, and even a very average driver is better than FSD.

Think about a two-way street where a truck has broken down, blocking a lane, and somebody 100 feet away takes it upon themselves to act as traffic cop. With just their hand and a look on their face, you know they are saying it's safe to go because the way is clear. It's very hard to imagine how an FSD system gets to that point. It may be that AI is simply too immature at this stage in history to do it.

I remain optimistic that passenger vehicles will have this capability in the future, though it's probably still many years off. I'd have absolutely no qualms betting money that HW3 cars will never, ever have mature Level 5 autonomy.
 
That's a different type of radar. There are different kinds of radar that operate at different wavelengths. The radar used in cars is different from the radar used in weather detection. Weather radar operates at a wavelength that reflects off rain and snow in order to detect it. Radar used in cars operates at a different wavelength that passes through rain and snow, since cars need to be able to see through it.

Automotive RADAR absolutely gets reflections from rain. The noise from the rain itself, if necessary, can be filtered out by a clutter suppression algorithm (*), though I'm not sure if they actually bother. Also water sitting on the bumper causes attenuation, and snow or ice on the bumper can cause enough attenuation for some automotive ADAS systems to refuse to drive at all.

RADAR can potentially handle heavy rain better than vision, of course, but there are fundamental limitations that preclude driving without usable vision, including being unable to read signs, determine traffic light colors, etc., so it is an open question whether adding high-definition RADAR would make enough difference to matter, except perhaps in terms of slightly improving safety in the first few seconds after rain suddenly starts pouring hard enough that the vehicle has to pull over to the side of the road.

* Decluttering algorithms also exist for LIDAR that can dramatically improve its performance in rainy conditions.

The main reason weather RADAR detects rain as well as it does is because it is aimed at nothing. Everything (other than ground clutter, which is removed algorithmically) that reflects is weather, so when you crank the gain up, whatever signal you get is weather. When you're pointing a low-gain RADAR emitter and receiver at solid objects, the signal from cars is huge and the signal from rain is small by comparison.
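The clutter suppression mentioned above is often implemented as CFAR (constant false alarm rate) detection: each range cell is compared against a threshold scaled from its neighbors' average power, so diffuse rain clutter raises the estimated noise floor while strong discrete returns like cars still stand out. A toy cell-averaging CFAR over a 1-D range profile (all numbers and parameter values invented for illustration):

```python
def ca_cfar(power, guard=1, train=3, scale=3.0):
    """Return indices of cells whose power exceeds scale * local noise average.

    guard: cells adjacent to the cell under test that are excluded (the
    target itself may leak into them); train: cells on each side used to
    estimate the local noise/clutter level.
    """
    hits = []
    for i in range(len(power)):
        lo = max(0, i - guard - train)
        hi = min(len(power), i + guard + train + 1)
        # Average the training cells around (but not including) the test cell.
        noise = [power[j] for j in range(lo, hi) if abs(j - i) > guard]
        if noise and power[i] > scale * (sum(noise) / len(noise)):
            hits.append(i)
    return hits

# Uniform rain clutter (~1.0) with one strong car return at index 5:
profile = [1.1, 0.9, 1.0, 1.2, 0.8, 20.0, 1.0, 1.1, 0.9, 1.0]
print(ca_cfar(profile))  # the car return survives; the rain clutter does not
```

This is the same reason cranking the gain on a weather radar works in reverse: with no solid targets, the "clutter" is the signal.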
 