Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Tesla Sensor Suite vs. LIDAR

I think the issue is: should Tesla add additional sensors, like LIDAR, to solve some of the current limitations?

Elon's point is that if they can make radar work, then there's no need for LIDAR. LIDAR works great... if visibility is good. Radar needs some improvement, but from a physics perspective, if they can bring it up to LIDAR's level, then radar will work ALWAYS and there will be no need for LIDAR.
 
Elon's point is that if they can make radar work, then there's no need for LIDAR. LIDAR works great... if visibility is good. Radar needs some improvement, but from a physics perspective, if they can bring it up to LIDAR's level, then radar will work ALWAYS and there will be no need for LIDAR.


Before the Florida accident, Tesla believed LIDAR was "unnecessary."

After it, Tesla believes it can use radar with "temporal smoothing to create a coarse point cloud, like lidar."

That sounds like Tesla is no longer dismissing what the gold standard of LIDAR can produce; rather, it now wants to achieve that standard, not with LIDAR hardware itself, but by tweaking the existing radar to emulate LIDAR.
 
The Tesla radar system uses a wavelength of 3.9 mm. While that is a lot longer than the wavelength of light, it's still pretty small, and it should be completely adequate for autonomous driving applications. As for LIDAR, I can't always wait for it to stop raining.
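As a rough sanity check (assuming the radar operates in the 77 GHz automotive band, which the post doesn't state), the 3.9 mm figure follows from the basic relation λ = c / f:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def wavelength_mm(freq_hz: float) -> float:
    """Wavelength in millimetres for a given frequency."""
    return C / freq_hz * 1000.0

radar_mm = wavelength_mm(77e9)  # assumed 77 GHz automotive radar band
print(f"radar wavelength: {radar_mm:.2f} mm")  # ~3.89 mm, matching the figure above
```

By comparison, a typical automotive lidar emits near-infrared light around 905 nm, roughly four thousand times shorter, which is why lidar resolves fine detail but scatters in rain and fog while radar passes through.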
 
I really do not have much to add here apart from one rather simple and perhaps naive observation: humans have been driving for a very long time using only stereoscopic vision. It would seem, by example, that this is all that is required.

I have a friend who lost an eye in their mid-20s. They still drive.
Cheaper Japanese brands use stereo cameras for cruise control (Suzuki and Subaru come to mind).

Radar, lidar, cameras, and ultrasonics all have pros and cons.

The bigger issue is how to deal with false positives when one system detects something and another doesn't.

Currently, false positives are ignored and false negatives are ignored. Adding lidar to radar and camera would allow a 2-out-of-3 scenario, which in theory should be better, given equally good programming.
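The 2-out-of-3 idea above can be sketched as a simple majority vote over per-sensor detections. This is a toy illustration, not Tesla's actual fusion logic, and the scenarios in the comments are invented examples:

```python
def majority_vote(camera: bool, radar: bool, lidar: bool) -> bool:
    """Treat an obstacle as real only if at least 2 of the 3 sensors report it."""
    return (camera + radar + lidar) >= 2  # bools sum as 0/1

# Camera and lidar see a crossing trailer; radar tunes it out as an overhead sign:
assert majority_vote(camera=True, radar=False, lidar=True) is True
# A lone radar ghost return (e.g. a manhole cover) is outvoted:
assert majority_vote(camera=False, radar=True, lidar=False) is False
```

The obvious cost is that any obstacle visible to only one sensor is also outvoted, which is the false-negative risk debated later in the thread.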
 
It is clear that LIDAR proponents are not anti-RADAR.

It has never been about taking RADAR away from the current configuration.

They want to use as many different kinds of sensors as possible so that each would complement the others' weaknesses.

I think the issue is: should Tesla add additional sensors, like LIDAR, to solve some of the current limitations?
The point is: having more sensors is useless if they don't serve a purpose.
If you can navigate with just radar + camera, why bother with lidar?
If even one function of the car depends, in any way, on the lidar, and the lidar isn't working properly (fog, rain, snow...), then you get false positives, or some part of the system simply stops working... so it's not a reliable system.
You can't have a system that works sometimes and sometimes doesn't; it just can't be done.

If you say "use the lidar at night, when the camera can't work," then OK, you have one sensor covering the deficit of another. But then, if it's night and it's raining, you are no better off than before, so again it serves no purpose.
 
I have a friend who lost an eye in their mid-20s. They still drive.
Cheaper Japanese brands use stereo cameras for cruise control (Suzuki and Subaru come to mind).

Radar, lidar, cameras, and ultrasonics all have pros and cons.

The bigger issue is how to deal with false positives when one system detects something and another doesn't.

Currently, false positives are ignored and false negatives are ignored. Adding lidar to radar and camera would allow a 2-out-of-3 scenario, which in theory should be better, given equally good programming.
Currently we are at Level 3; at Level 4 you can't ignore false positives.
At Level 3 you can, since the driver is the safety net; at Level 4 you don't have the net, so you can't discard information.

You can't have an unreliable sensor, since you can't drop the information it gives you.

Say the lidar sees what a camera can't (at night, in poor illumination), but you treat it as a possible false positive since the camera and radar didn't see it, so you discard the data... so what's the point? You have a good sensor that gave you good information, but you can't use it because it's considered unreliable.

Take the other case: you choose to go 2 out of 3? OK, say the radar has some false positives and it's raining, so the lidar also has false positives (and this is certain; it will happen, and it will be no strange coincidence). So you act, but it was a false positive, and then you are in the *sugar*. What good has lidar done you? Nothing.

You simply can't have an unreliable source. If you just say "OK, but the radar can have false positives," then you are wrong.
If a sensor can produce false positives, then a simultaneous one is less likely, but even 10 sensors can produce a false positive at the same time. Using more sensors doesn't solve the problem; it only reduces the probability, and it costs double.

The good (and only) way is to solve the problem so you don't have false positives.
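The probability argument above can be made concrete. With n independent sensors, each producing a false positive with probability p on a given frame, a simultaneous false positive across all of them has probability p^n: smaller, but never zero. And the independence assumption itself breaks down when rain degrades radar and lidar together, exactly the correlated case the post describes. A toy model with assumed numbers:

```python
def all_fire_prob(p: float, n: int) -> float:
    """Probability that n independent sensors all produce a false positive
    on the same frame, each with per-frame false-positive probability p."""
    return p ** n

# Assumed per-frame false-positive rate of 1% per sensor:
print(all_fire_prob(0.01, 1))  # 0.01: one sensor
print(all_fire_prob(0.01, 3))  # ~1e-06: three truly independent sensors
```

When weather correlates the sensors' errors, the real joint probability sits far above p^n, which is why "just add more sensors" reduces but cannot eliminate the problem.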
 
The point is: having more sensors is useless if they don't serve a purpose

That's Tesla's position prior to the Florida crash, when it said LIDAR was "unnecessary."


...If you can navigate with just radar + camera, why bother with lidar?...

Apparently, the current Tesla system didn't navigate well without LIDAR in the Florida accident. That's why there's a Tesla tweet, inspired by LIDAR, about adjusting the current radar system to emulate LIDAR.

I don't think that tweet was a result of a bad nighttime-weather accident, but of a certain accident in very good daytime weather.

I don't know about this coming Uber disclosure, but Google, with its LIDAR, has been an open book: it has been disclosing all accidents, and LIDAR is still the gold standard.

For a bad rain or snow day, Ford released a video of its LIDAR in action:



For good night driving with no lights in a winding road:

https://i.kinja-img.com/gawker-media/image/upload/s--mdcdrUtt-lltpotdhanng2oipd6ul.gif
 
Apparently, the current Tesla system didn't navigate well without LIDAR in the Florida accident. That's why there's a Tesla tweet, inspired by LIDAR, about adjusting the current radar system to emulate LIDAR.
Again... no, this doesn't prove anything; just because it failed doesn't mean that radar isn't good enough.
It could be (and probably is) that the software used for the radar isn't good enough and can't discriminate; hence the idea of improving the radar. The question was: is this a truck or a road sign?
If you can historicize the data, you would know that a road sign can't appear out of nothing; in that case you can conclude it's not a false positive.
But, again, even if the radar can't distinguish it from a road sign, the camera can.
The problem with the camera is that it's a *sugar* of a camera, probably because Mobileye can't afford to process the output from a better camera. But the point is: while a radar can have problems detecting a suspended object, a camera doesn't have this problem and should see it.

Or would you say that a NORMAL camera can't see a truck? That would be really silly...
 
....but the point is: while a radar can have problems detecting a suspended object, a camera doesn't have this problem and should see it.

Or would you say that a NORMAL camera can't see a truck? That would be really silly...

Put it another way: the current Tesla camera system might be unable to detect obstacles in some lighting conditions due to its monochrome hardware. I don't think Tesla has publicly suggested switching to a color camera, but from the users' discussion of how to solve this problem, it is reasonable to believe that Tesla will in the future.

Even with an all-color camera, it is questionable whether it can detect obstacles when "Bright light (oncoming headlights or direct sunlight) is interfering with the camera's view."

Again... no, this doesn't prove anything; just because it failed doesn't mean that radar isn't good enough...

It could be (and probably is) that the software used for the radar isn't good enough and can't discriminate; hence the idea of improving the radar. The question was: is this a truck or a road sign?..

To solve the problem of the radar detecting a tractor-trailer but tuning it out, because "Radar tunes out what looks like an overhead road sign to avoid false braking events," Tesla will do "temporal smoothing to create a coarse point cloud, like lidar."

Thus, in summary: Tesla said LIDAR was "unnecessary."
Now, Tesla will make radar work as if it were LIDAR.

In the other camp, the LIDAR group says: just add it to your configuration, because technologically (from hardware and software standpoints), LIDAR has been readily available to solve the Florida scenario.
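The "temporal smoothing" tweet can be read as accumulating radar returns across frames: a return that appears once and vanishes is likely noise, while a return that persists frame after frame is likely a real object. A minimal sketch of one way such accumulation could work; the grid cells, frame format, and threshold here are all assumptions for illustration, not Tesla's implementation:

```python
from collections import defaultdict

def persistent_cells(frames, min_hits=3):
    """Accumulate per-frame radar returns into a coarse grid and keep only
    cells seen in at least `min_hits` frames, filtering one-frame ghosts.

    `frames` is a list of frames; each frame is a list of (range_bin,
    azimuth_bin) returns, an assumed, simplified radar output format.
    """
    hits = defaultdict(int)
    for frame in frames:
        for cell in set(frame):  # count each cell at most once per frame
            hits[cell] += 1
    return {cell for cell, n in hits.items() if n >= min_hits}

frames = [
    [(10, 0), (4, 2)],   # frame 1: return at (10, 0), plus a ghost at (4, 2)
    [(10, 0)],           # frame 2: the (10, 0) return persists
    [(10, 0), (7, 5)],   # frame 3: persists again; new one-frame ghost
]
print(persistent_cells(frames))  # {(10, 0)}: only the persistent return survives
```

Distinguishing an in-path truck from an overhead sign would additionally need elevation or geometry over time (a sign's return should sweep overhead as the car approaches), which this sketch does not attempt.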
 
This is not my area of expertise, but we had someone who knows a ton about the tech write an article on this:

Tesla & Google Disagree About LIDAR — Which Is Right?


Thanks for the great article that says all sensors have their pluses and minuses.

It seems to conclude in the end that for now, Tesla is correct about skipping LIDAR until there's a breakthrough in LIDAR.

The conclusion might be misinterpreted to mean that LIDAR is technologically unable to solve the current Florida challenge until there's a breakthrough.

"For now, it appears as if Tesla is correct. With the advent of solid-state LIDAR at steadily decreasing price points, and with better performance characteristics, that may change soon. [Editor’s Note: I assume developers of any breakthrough LIDAR system would be extremely eager to have Tesla as a client and would be pushing for trial use of their tech, which I imagine Tesla would happy to implement. In other words, as soon as such potentially breakthrough LIDAR is ready for production, I assume Tesla will be one of the quickest (if not the quickest) to be offered it and to actually implement it in production vehicles.]"
 
From what I can see based on how the system performs, the model it builds is instantaneous; that is, once a target leaves its field of view, it's out of the model. I also think the field of view is closer to 120 degrees, or maybe a bit more.

Other than that, I totally agree with your comments.
No, it's definitely closer to 50 degrees (talking about the top picture on page 4; the last picture may be from a wider camera). I'm not sure if you are a camera guy, but that is about the same as a 45/50mm lens on 35mm full frame, or a 30mm lens on APS-C. To get close to 120 degrees you would need an ultra-wide 14mm lens for full frame (~115 degrees) or 10mm for APS-C (~110 degrees).

Sources say the current setup is with a 50 degree camera. Here's an illustration of the rough coverage of the 50 degree camera (versus wider and narrower cameras):
[Image: T96_Mobileye_F2v3.png, camera field-of-view comparison, via The Linley Group]
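The focal-length comparison above follows from the standard diagonal field-of-view formula, FOV = 2·atan(d / 2f), where d is the sensor diagonal (43.3 mm for a 36×24 mm full-frame sensor). A quick check of the numbers quoted:

```python
import math

def diagonal_fov_deg(focal_mm: float, diag_mm: float = 43.3) -> float:
    """Diagonal field of view in degrees for a lens of the given focal length
    on a sensor with the given diagonal (default: 35mm full frame)."""
    return math.degrees(2 * math.atan(diag_mm / (2 * focal_mm)))

print(round(diagonal_fov_deg(50)))  # ~47 degrees: close to the "50 degree" camera
print(round(diagonal_fov_deg(14)))  # ~114 degrees: the quoted ~115-degree ultra-wide
```

So the post's claim holds up: a roughly 50-degree camera corresponds to a normal ~45-50mm lens, and reaching ~115 degrees indeed requires an ultra-wide 14mm on full frame.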
 
It seems to me that the real question is the software that puts the sensor information together. If it's really the case that Tesla's radar and camera systems are independent and can come up with different answers to the question of whether it's a truck or an overpass, the software is a very long way from optimal.

Lidar systems should have higher angular resolution than radar, but the camera is better than either by a lot. That's why I believe that it's the sensor interpreting software that should be where the effort is put.
 
It seems to me that the real question is the software that puts the sensor information together. If it's really the case that Tesla's radar and camera systems are independent and can come up with different answers to the question of whether it's a truck or an overpass, the software is a very long way from optimal.

Lidar systems should have higher angular resolution than radar, but the camera is better than either by a lot. That's why I believe that it's the sensor interpreting software that should be where the effort is put.
That's going to be true of any system that relies on multiple types of sensors. A lot of it falls on the software and processing of that data. In the Florida case, the system erred on the side of interpreting it as a false positive. This is the safer/less annoying approach assuming you have a driver paying attention that can take over (as in a semi-autonomous system).

In a fully autonomous system, this assumption would not be acceptable and Elon touched on that (he mentioned there needed to be a lot more processing power and redundancy).
 
It seems to me that the real question is the software that puts the sensor information together. If it's really the case that Tesla's radar and camera systems are independent and can come up with different answers to the question of whether it's a truck or an overpass, the software is a very long way from optimal.

Lidar systems should have higher angular resolution than radar, but the camera is better than either by a lot. That's why I believe that it's the sensor interpreting software that should be where the effort is put.

I also believe that the sensor fusion is the hardest thing.
 
That's going to be true of any system that relies on multiple types of sensors. A lot of it falls on the software and processing of that data. In the Florida case, the system erred on the side of interpreting it as a false positive. This is the safer/less annoying approach assuming you have a driver paying attention that can take over (as in a semi-autonomous system).

In a fully autonomous system, this assumption would not be acceptable and Elon touched on that (he mentioned there needed to be a lot more processing power and redundancy).
Totally agree. There is a real issue, IMHO, with the validity of the "assuming you have a driver paying attention" part. Is a system that works 99% of the time actually just a partial implementation, or an irresponsibly designed trap? I say it's a partial implementation, but I can understand people who view it as a trap.

Personally, having worked with software, computers, and their engineers for decades, I have ZERO faith in any of them. Both my wife (also a software engineer) and I always use AP on every possible occasion and always watch it like hawks.
 
Personally, having worked with software, computers, and their engineers for decades, I have ZERO faith in any of them.

This is where machine learning becomes vital. It's nearly impossible to hand-program software to separate the noise from what's real... but you can create a program capable of learning. Most voice recognition that works well wasn't programmed... it learned how to understand speech. Autonomous cars will improve in the same way.

This isn't the type of AI that Elon keeps warning us about... this is a very specific form of machine learning.

It will take time but it's continuous improvement. I have little doubt that in <10 years it'll be safer to let the car drive.

 
Currently we are at Level 3; at Level 4 you can't ignore false positives.
At Level 3 you can, since the driver is the safety net; at Level 4 you don't have the net, so you can't discard information.

You can't have an unreliable sensor, since you can't drop the information it gives you.

Say the lidar sees what a camera can't (at night, in poor illumination), but you treat it as a possible false positive since the camera and radar didn't see it, so you discard the data... so what's the point? You have a good sensor that gave you good information, but you can't use it because it's considered unreliable.

Take the other case: you choose to go 2 out of 3? OK, say the radar has some false positives and it's raining, so the lidar also has false positives (and this is certain; it will happen, and it will be no strange coincidence). So you act, but it was a false positive, and then you are in the *sugar*. What good has lidar done you? Nothing.

You simply can't have an unreliable source. If you just say "OK, but the radar can have false positives," then you are wrong.
If a sensor can produce false positives, then a simultaneous one is less likely, but even 10 sensors can produce a false positive at the same time. Using more sensors doesn't solve the problem; it only reduces the probability, and it costs double.

The good (and only) way is to solve the problem so you don't have false positives.

Let's take a hypothetical scenario of a tailor across a road: the camera sees it, the lidar sees it, the radar does not see it.

Or another hypothetical scenario of a dust whirl on a road: the camera does not see it, the radar does not see it, the lidar saw it.

Or a bicycle crossing the road with the sun behind it: the radar sees it, the camera does not see it, the lidar sees it.

1st is a false negative
2nd is a false positive
3rd is a false negative

false sensor readings are a fact of life
 
Let's take a hypothetical scenario of a tailor across a road: the camera sees it, the lidar sees it, the radar does not see it.

Or another hypothetical scenario of a dust whirl on a road: the camera does not see it, the radar does not see it, the lidar saw it.

Or a bicycle crossing the road with the sun behind it: the radar sees it, the camera does not see it, the lidar sees it.

1st is a false negative
2nd is a false positive
3rd is a false negative

false sensor readings are a fact of life
What is a tailor? (Sorry, I'm from Italy and sometimes it gets confusing...)

You are on my side, only you don't recognize it! You made the perfect case.
If you exclude the lidar (2nd case), you SHOULD STOP EVERY TIME.

The only problem is the false positive: if you don't have false positives, you can't have false negatives (since you stop every time a sensor picks up something in your path), and that's why it's not a good idea to add a sensor that is highly prone to false positives that are hard to fix.
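The "stop every time any sensor picks something up" policy is OR-fusion, and it is only viable if per-sensor false-positive rates are near zero; otherwise the false alarms of all sensors combine, and the car brakes constantly. A toy illustration with assumed rates:

```python
def or_fusion_false_alarm(rates):
    """Probability that at least one of several independent sensors fires
    falsely on a given frame: 1 - product(1 - p_i)."""
    prob_all_quiet = 1.0
    for p in rates:
        prob_all_quiet *= (1.0 - p)
    return 1.0 - prob_all_quiet

# Assumed per-frame false-positive rates: camera 1%, radar 2%, lidar 5% in rain.
fa = or_fusion_false_alarm([0.01, 0.02, 0.05])
print(f"{fa:.4f}")  # ~0.0783: a phantom-braking trigger on roughly 8% of frames
```

This is the flip side of the majority-vote trade-off discussed earlier: OR-fusion drives false negatives toward zero but accumulates every sensor's false positives, which is the poster's argument for fixing false positives at the source.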
 
What is a tailor? (Sorry, I'm from Italy and sometimes it gets confusing...)

You are on my side, only you don't recognize it! You made the perfect case.
If you exclude the lidar (2nd case), you SHOULD STOP EVERY TIME.

The only problem is the false positive: if you don't have false positives, you can't have false negatives (since you stop every time a sensor picks up something in your path), and that's why it's not a good idea to add a sensor that is highly prone to false positives that are hard to fix.

A tailor is someone who makes custom clothing. I don't think Autopilot cares about a person's profession. I think @renim meant "trailer," as in a truck trailer.
 