Welcome to Tesla Motors Club

Drago Anguelov – Machine Learning for Autonomous Driving at Scale

Correct me if I'm wrong, but your initial point was that lidar can "do" everything that cameras can with respect to FSD, read signs, markings, etc. Except recognizing traffic lights. To the point where you can use lidar alone to achieve FSD without HD maps, again except reading traffic lights?

I always get confused when you say "do" or "has".
 
Not sure what I am wrong about. Best lidar has 250 vertical lines of resolution. This isn't good enough to read road signs or markings at distances required for FSD based on lidar alone without HD maps.

As for your other points on lidar recognizing things, yes, lidar is used to create HD maps, so it's helpful that lidar can assist in detecting things at close distance so that the HD map can be created. This has nothing to do with your point that lidar can recognize road markings and signs as well as cameras. It's not even close.

[Attached image: tesla-autopilot-slide.jpg]

Not sure if you're trolling, but lidar can't do the majority of what's in that picture.

Ban Yourself! You are clueless
 
Do you understand that lidar handles every single thing that a camera does for autonomous driving, other than seeing the color of traffic lights? I suppose by human-level vision you are referring to cameras?

Here it is folks, bladerskb claiming that lidar can currently replace cameras for FSD, as long as the car has some way to read traffic lights, lol.

This, by the way, was what I initially disagreed about, and I'm right about my disagreement. Bladerskb totally ignored the poor vertical resolution of lidar, which means it can't read text on any road signs or markings at a distance. And in the case of road signs, it probably can't even read them until 10 feet away.
 

Redundancy is the main feature of a good FSD system, and this is why many companies use lidar, vision, radar, etc. Even if lidar could do absolutely everything, or vision could do everything, it's still a safer plan to use multiple sensors, or the best features of each.

Even if lidar could read signs (vertical resolution is improving and angular resolution may be a more important spec anyway) it still may be a better idea to use vision to read the signs (perhaps after lidar finds them using reflectivity). Whichever gives the best outcome. Neither is always an all-encompassing solution, nor is that the expectation. As I see it anyway.

It would be nice if one system did everything but why should it? Use several for the widest range of weather, obstacle, and detection conditions. Lidar does many things much better than vision, and vision has some advantages over lidar. Use them both.
 
I think we should be comparing Tesla Vision specifically to lidar, rather than vision in general to lidar.

The reason is that Tesla Vision still can't recognize, localize, and size-estimate generic static objects on a consistent basis. It can't do it even in the forward view, despite having three cameras with overlapping FOVs.

This is vastly important because not everything on the road is going to be recognized by a neural network. So there needs to be some kind of generic "blob" handling, where the car tries to avoid the object if possible, and stops if it is bigger than a preset size and can't be avoided.
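That fallback logic can be sketched roughly like this (the thresholds, field names, and return values are all hypothetical, purely to illustrate the idea):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    label: Optional[str]        # None when the classifier has no idea
    height_m: float             # estimated size of the "blob"
    lateral_clearance_m: float  # room available to steer around it

# Hypothetical thresholds, for illustration only
DRIVE_OVER_MAX_HEIGHT_M = 0.15   # small debris the car can straddle
MIN_SWERVE_CLEARANCE_M = 0.5     # room needed to go around safely

def plan_for(obj: Detection) -> str:
    """Fallback for objects the neural network can't classify:
    avoid if possible, stop if it's big and unavoidable."""
    if obj.label is not None:
        return "handle_classified"   # normal planner path
    if obj.height_m <= DRIVE_OVER_MAX_HEIGHT_M:
        return "drive_over"
    if obj.lateral_clearance_m >= MIN_SWERVE_CLEARANCE_M:
        return "steer_around"
    return "stop"
```

The point of the sketch is just the last branch: an unclassified object that is both tall and unavoidable has to trigger a stop, whatever else the planner does.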

Having front-facing solid-state lidar would solve a lot of Tesla's false-braking and late-braking issues. That alone doesn't solve FSD, because FSD needs redundancy. So it would force Tesla Vision to improve to the point where the two systems combined make the system immune to a single-fault failure from a safety perspective. Having frontal lidar would give Tesla the ability to cross-check the systems continuously.
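One way to picture that continuous cross-check: compare each frame's camera detections against lidar detections and flag whatever only one sensor reports. A toy sketch, with positions as (x, y) in meters on the road plane and a made-up matching tolerance:

```python
def cross_check(cam_dets, lidar_dets, tol_m=1.0):
    """Return detections reported by only one of the two sensors.
    Persistent disagreement would indicate a sensor fault."""
    def unmatched(a, b):
        return [p for p in a
                if not any(abs(p[0] - q[0]) <= tol_m and
                           abs(p[1] - q[1]) <= tol_m for q in b)]
    return {"camera_only": unmatched(cam_dets, lidar_dets),
            "lidar_only": unmatched(lidar_dets, cam_dets)}

# Agreement on one object, plus one object only lidar reports:
report = cross_check([(12.0, 0.5)], [(12.3, 0.4), (40.0, -1.0)])
print(report)   # camera_only empty, lidar_only has the 40 m object
```

Real fusion associates tracks over time rather than single frames, but the principle is the same: each modality audits the other.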

On the sides/rear it might not need lidar; radar could augment the visual data well enough that after a single-fault failure the car could still use the remaining sensors to safely exit the roadway and go into some limp mode.
 
Not sure what I am wrong about. Best lidar has 250 vertical lines of resolution. This isn't good enough to read road signs or markings at distances required for FSD based on lidar alone without HD maps.

As for your other points on lidar recognizing things, yes, lidar is used to create HD maps, so it's helpful that lidar can assist in detecting things at close distance so that the HD map can be created by the human editor. This has nothing to do with your point that lidar can recognize road markings and signs as well as cameras. It's not even close.
lol no there are several lidars with over 1,000 vertical lines of resolution.
There are lidars with over 500 points per square degree.
Camera-like resolutions.
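For scale, a points-per-square-degree figure converts to returns-on-target with a little trigonometry. A rough sketch, using the 500 points per square degree quoted above (the sign size and ranges are my own illustrative numbers, not from any datasheet):

```python
import math

def points_on_target(width_m, height_m, range_m, density_per_sq_deg):
    """Rough count of lidar returns landing on a flat target facing
    the sensor, from its angular size and the sensor's point density
    (points per square degree)."""
    w_deg = math.degrees(2 * math.atan(width_m / (2 * range_m)))
    h_deg = math.degrees(2 * math.atan(height_m / (2 * range_m)))
    return w_deg * h_deg * density_per_sq_deg

# Hypothetical 0.75 m x 0.75 m speed-limit sign:
for r in (25, 50, 100):
    pts = points_on_target(0.75, 0.75, r, 500)
    print(f"{r:>3} m: ~{pts:.0f} returns on the sign")
```

Roughly 90 returns at 100 m is plenty to detect and outline the sign, though typically still far short of the pixel count a camera puts on the same sign for reading its text.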
 
Well, Microvision is working on 944 vertical lines. Nothing has been released from them yet.

I prefer the region-of-interest method of lidar scanning: get a general view at 10 Hz, but then focus on select areas at higher resolution and higher frequency, like the human eye. Scan the scene, ignore the irrelevant, focus on the important. Plus, the instant velocity measurements mean you can go straight to decision-making rather than doing complex image processing to determine whether something is moving and how fast.
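A minimal sketch of that scan pattern: keep a regular full sweep going while spending the spare slots revisiting focus regions at a higher rate. The class name, rates, and ROI representation are all invented for illustration:

```python
import itertools

class RoiScheduler:
    """Interleave full-frame sweeps with extra revisits of regions
    of interest: scan the scene, then keep returning to the
    important parts at a higher rate."""
    def __init__(self, base_hz=10, roi_hz=30):
        self.slots_per_cycle = roi_hz // base_hz   # 3 slots per 10 Hz frame
        self._tick = itertools.count()
        self.rois = []   # (azimuth_deg, elevation_deg) focus windows

    def next_command(self):
        t = next(self._tick)
        if t % self.slots_per_cycle == 0 or not self.rois:
            return ("full_sweep", None)
        return ("roi_sweep", self.rois[t % len(self.rois)])
```

With two ROIs registered, the schedule comes out as full sweep, ROI, ROI, full sweep, and so on, which is the "general view plus focused revisits" pattern.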

High vertical line resolution is less useful for autonomous driving than better vertical & horizontal angular resolution.
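To see why line count alone is misleading, convert it to angular spacing and then to the physical gap between scan rows at range. The FOV and range here are illustrative, and real sensors often pack lines unevenly:

```python
import math

def row_spacing_m(n_lines, vfov_deg, range_m):
    """Vertical gap between adjacent scan lines at a given range,
    assuming lines spread evenly over the vertical FOV."""
    return range_m * math.tan(math.radians(vfov_deg / n_lines))

for lines in (64, 250, 1000):
    gap = row_spacing_m(lines, 25.0, 100.0)
    print(f"{lines:>5} lines over 25 deg: {gap*100:.1f} cm between rows at 100 m")
```

Even 1,000 lines over a 25-degree FOV leaves roughly 4 cm between rows at 100 m; enough to find a sign, marginal for reading small text on it.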

Again, image sensors are likely required too. Lidar==good, Vision==good. Both==better.
 
Lidar is resolution limited because of the way it works. It requires very precise manufacturing and sensors to accurately measure the laser beams. Plus, you have one drop of water near the lidar and the lasers all go out of whack.
 
Lidar seems to do quite well in the rain.


"the three images in the top right are the structured data panoramic images output by the lidar sensor -- no camera involved. The top image displays ambient light (sunlight) captured by the lidar sensor. The second image displays the intensity signal (strength of laser light reflected back to the sensor), and the third image displays the range of objects calculated by the sensor.

You can see that water doesn't obscure the lidar signal and range images, even though there are water droplets on the lidar sensor’s window. The ambient image is a bit grainy because the cloud cover reduces the amount of sunlight, but still shows no impact from the rain."
 

Sure, if you define a marketing video from Ouster as "well". The first three outputs on the top right look very low-resolution to me. Also, since this is a marketing video, I don't know how the LIDAR is mounted or anything about its implementation. Who knows, maybe they applied some RainX (or some short-duration oleophobic film) to the LIDAR right before the drive, or some other impractical manipulation to improve rain performance?

You have to take all marketing materials with a grain of salt.

Also, rain doesn't obscure LIDAR; it reduces its effective resolution even further because of laser distortion. You also have to deal with rain residue on the LIDAR screen. The effects of rain will probably be much worse with solid state LIDAR. All the FSD developers using LIDAR are probably cleaning their sensors manually on a routine basis.
 
On the link they fully explained the setup, device, mounting, and test procedure; did you miss reading the story in the link? It was more of an experimental result than marketing, but you are free to disbelieve either. Nothing seemed off to me.

They gave a good analysis of the results.

You said "Plus, you have one drop of water near the lidar and the lasers all go out of whack.", and I think this was a decent attempt to refute that statement. I'm not going to debate the point. We get that you don't like Lidar.

Lidar vs. camera: driving in the rain | Ouster
 

You're right, I was using too much hyperbole there, but my previous comments about taking marketing material with a grain of salt still stand. We actually don't know whether Ouster did anything to improve their rain performance, since they're the ones trying to promote their product. For example, if you look at their LIDAR window, it's immaculately clean except for the small rain droplets. They're obviously creating ideal conditions for their marketing video. If we've learned anything from the Nikola fiasco, it's that we should be skeptical of marketing media.

Also, I wasn't saying that rain obscures LIDAR or that it won't work. I was pointing out that rain droplets do distort and scatter the laser beams, and it's not like many LIDARs have wipers installed; see the Ouster pic you linked.

They even pointed out the low resolution aspect of LIDAR:

Cameras bring high resolution to the table, where lidar sensors bring depth information.
 
Thank you for reviewing the link. As you quoted from it, I shall as well:

...water doesn't obscure the lidar signal and range images, even though there are water droplets on the lidar sensor’s window
...The result is that the range of the sensor is reduced slightly by the water, but the water does not distort the image at all
...the sensor is less able to see the road surface at long ranges. That said, the range of the sensor is unaffected on all other objects (cars, buildings, trees, etc.).


Perhaps they are gaming the test, who knows. They are being honest in reporting the positives and negatives of Lidar and cameras so I give them some respect for that.

Shall we discuss the marketing honesty of Tesla next?
 
...water doesn't obscure the lidar signal and range images, even though there are water droplets on the lidar sensor’s window
...The result is that the range of the sensor is reduced slightly by the water, but the water does not distort the image at all

This is all a play on words if you think about it. No one is saying (except for their Twitter examples) that rain obscures LIDAR.
The fact that the range is reduced is BECAUSE of laser distortion and scattering.

You have to keep in mind that the article says "lidar sensors bring depth information" and "cameras bring high resolution". In the context of this statement, it's possible they only care that some LIDAR points return when it's raining. The fact is, fewer LIDAR points return, especially from far away, reducing the resolution at range.
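That "fewer points from far away" effect can be sketched with two-way Beer-Lambert attenuation; the extinction coefficient below is a made-up illustrative value, not a measurement:

```python
import math

def return_fraction(range_m, extinction_per_km):
    """Fraction of pulse energy surviving the round trip through
    rain, using two-way Beer-Lambert attenuation. extinction_per_km
    is the one-way extinction coefficient in 1/km."""
    return math.exp(-2 * extinction_per_km * range_m / 1000.0)

# Hypothetical heavy-rain extinction of 4/km: distant returns
# lose far more signal than nearby ones.
for r in (25, 100, 200):
    print(f"{r:>3} m: {return_fraction(r, 4.0):.2f} of clear-air signal")
```

The round trip doubles the loss, so whatever the real coefficient is, the weakest returns (long range, low reflectivity) drop below the detection threshold first.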

You have to be skeptical of the wording of these marketing articles. For example, you pointed out (emphasis mine):

...The result is that the range of the sensor is reduced slightly by the water, but the water does not distort the image at all

They are ONLY referring to this one 3-minute video, not making some general statement.

It is possible that:

1) If the rain is slightly heavier, the result is much less usable
2) They cleaned the LIDAR window before the test
3) The rain droplets they showed in the picture are from AFTER the drive. It's possible that the 3-minute video was from the beginning of the drive, so there weren't as many droplets on the window.
4) And more possibilities!
 
Perhaps they are gaming the test, who knows. They are being honest in reporting the positives and negatives of Lidar and cameras so I give them some respect for that.

Shall we discuss the marketing honesty of Tesla next?

Don't you know that Tesla is the arbiter of truth? Anything they say is automatically true and canon, no matter what. Anything anyone else says is a lie, done for PR, marketing, stock, and sales reasons. But not Tesla.

You said "Plus, you have one drop of water near the lidar and the lasers all go out of whack.", and I think this was a decent attempt to refute that statement. I'm not going to debate the point. We get that you don't like Lidar.

Lidar vs. camera: driving in the rain | Ouster
That's typical @powertoold right there. He ignorantly makes an insane false statement as fact, gets called out and easily proven wrong, then claims he didn't make or mean it and tries to move the goalposts/misdirect.
 

Lol, I was making that statement in the context of you claiming that lidar can be used solely for FSD, without cameras (like reading signs in the rain, duh?).

What's worse: my factually true statement that lidar lasers go all out of whack with rain, or your statement that lidar can currently be used in place of cameras for FSD (except for reading traffic lights)?