Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Bosch enters lidar business

Lidar sensors are about to become a mainstream car feature

Bosch will start providing lidar units for production vehicles. At first, they will likely replace front-facing radar.

Yeah, I suspect that a front lidar will become standard. Tesla should really get with the program.

This passage explains perfectly why a front lidar is so important. Front radar alone is simply not adequate for cases like stopped cars. Yeah, I know Tesla wants vision to handle these cases so that lidar is not needed, but a front lidar for redundancy would still be a great idea.

"Right now, most advanced driver-assistance systems (ADAS) rely on radar as the primary way to track nearby objects. But radar sensors have severe limitations. They have low horizontal and vertical resolution, making them incapable of distinguishing a car directly ahead from a metal sign overhead or a concrete lane divider next to the road.

Many of today's ADAS systems cope with this by ignoring objects that are not moving relative to the road. That works most of the time on freeways, since other cars are usually moving, and large stationary objects are usually not in a travel lane. But it means that ADAS systems occasionally make mistakes like steering right into a parked fire truck.

Hence, lidar has the potential to substantially improve the performance of today's ADAS systems. Radar may not be able to distinguish a fire truck parked next to the travel lane from one parked in the travel lane. But a lidar sensor can. With help from lidar, the next generation of ADAS systems will better understand its environment and be able to avoid more crashes."
 
I don't know much about the technology but it does seem like lidar is the technology to use. Every robot from Boston Dynamics uses lidar to navigate the environment. I know Elon dismisses it, but not everything he says is the ultimate truth. Solid state lidar systems are looking promising.
 
It just comes down to whether a time-of-flight ranging sensor is really needed for an accurate depth map/point cloud, or whether depth-from-vision (i.e. ConvNets) has gotten good enough that adding ranging sensors is just superfluous data. Depth from vision isn't always the most accurate; sometimes it can produce crazy results. Front-facing vision on Teslas, with the radar fused in, always seems to have rock-solid depth predictions. I never see front cars bouncing around or anything, so I don't think adding front-facing LIDAR would help Tesla's case.

On the other hand, with the side-facing repeater cameras that rely purely on vision from a single cam, there are many cases where the depth prediction goes nuts, like when vehicles fill the entire camera frame and edges are occluded; the confidence of the vision system goes way down. Maybe there's an argument to be made for adding rear and/or side radar, or maybe future releases will improve vision performance for the side/rear cams with a better model.
 
Yes, let's replace radar (which can see in all weather conditions) with something that does not work well in rain, snow, or fog.
My bet is on improving radar rather than going with lidar...

I think the best solution, which pretty much every self-driving company is doing, is to use all three. Put multiple cameras and a radar and a lidar in the front of the car. That way you have triple redundancy in the most important direction (direction of motion).
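The "use all three" idea boils down to a majority vote: no single sensor's false positive or miss should decide the outcome. Here is a toy sketch of that logic; the sensor flags and the two-of-three threshold are purely hypothetical illustrations, not any shipping system's design.

```python
# Toy sketch of triple-redundant obstacle detection by majority vote.
# The sensor inputs and the 2-of-3 threshold are hypothetical.

def obstacle_ahead(camera_says: bool, radar_says: bool, lidar_says: bool) -> bool:
    """Flag an obstacle only if at least two of the three sensors agree."""
    votes = sum([camera_says, radar_says, lidar_says])
    return votes >= 2

# A radar false positive (e.g. an overhead sign) is outvoted:
print(obstacle_ahead(camera_says=False, radar_says=True, lidar_says=False))  # False
# Camera + lidar agreeing on a stopped car overrides a radar miss:
print(obstacle_ahead(camera_says=True, radar_says=False, lidar_says=True))   # True
```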
 
Clearly Tesla has not gotten depth from vision and radar to work well enough to make a self-driving car. To me, RADAR is way more of a crutch than LiDAR, since it allows you to easily make a system that works most of the time but not nearly well enough for an autonomous vehicle. LiDAR would certainly reduce the number of fire trucks the car hits.
A purely vision-based system is obviously possible, but the best engineering solutions often don't come from imitating nature. People tried to make airplanes that fly like birds and sewing machines that sew like humans before inventing successful designs.
 
Honestly, I don't understand the opposition to lidar, other than maybe some Tesla fans just wanting to defend Elon at all costs. Even if camera-only vision does work for self-driving, you would still need extra redundancy, because "good enough" is not good enough. Why would you not want the solution with the most redundancy and highest reliability, especially for something as potentially dangerous as an autonomous car? It makes all the sense in the world to add extra sensors so the car has extra layers of safety. I got news for you: I am never sleeping in the back seat of any autonomous car unless I know it is 1000% safe. Plus, every serious autonomous driving company employs cameras, radar, and lidar covering all angles of the car. I think they know what they are doing.
 

Is depth the roadblock to shipping Tesla FSD? I don't think so; they have shipped aggressive lane-change behaviors where judging depth from a single camera is critical. It's the difference between the surrounding traffic being placed in the adjacent lane versus one lane over. And that's the hardest case; front-facing depth is much easier. If depth isn't the issue today, then what exactly are time-of-flight sensors needed for?

Autopilot hits stopped cars when traveling at highway speeds because, like all emergency braking systems, it is tuned to avoid false positives. I don't see how a time-of-flight laser will make that better: it's not judging depth that's the issue, it's a problem of classification. You can't slam on the brakes on the highway because LIDAR saw some point returns that belong to a fire truck in the far distance; you need vision to classify the points as something that is going to ruin your day or not.
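The false-positive argument can be sketched as a gating rule: a hard-braking decision keyed on classification confidence rather than on raw range returns alone. Every name and threshold below is made up for illustration; this is not any vendor's actual logic.

```python
# Hypothetical sketch: emergency braking gated on classification confidence,
# not raw range returns, to limit false positives at highway speed.
# All thresholds are invented for illustration.

BRAKE_CONFIDENCE = 0.9   # required confidence the object is in-path and solid
BRAKE_RANGE_M = 120.0    # only act on objects within this range

def should_brake(range_m: float, in_path_confidence: float) -> bool:
    return range_m < BRAKE_RANGE_M and in_path_confidence >= BRAKE_CONFIDENCE

# A few distant point returns on "something" - not enough to act on:
print(should_brake(range_m=110.0, in_path_confidence=0.3))   # False
# Vision confidently classifies an in-lane stopped vehicle:
print(should_brake(range_m=110.0, in_path_confidence=0.97))  # True
```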

Radar is preferable for supplementing vision-based depth because vision and near-IR LIDAR both sit near the visible part of the spectrum and obey the same physics. Radar actually conveys new information: its absorption/reflectance characteristics are radically different, it can pick up unseen vehicles via indirect returns that are completely occluded to cameras and LIDAR, and it penetrates weather more reliably.
 
I'm no expert, but what object would be anything close to the shape of a fire truck or car and still be OK to run into? I agree things get much trickier when you're talking about tumbleweeds and plastic bags.
I doubt random RADAR reflections are beneficial to self-driving vehicles. I thought there have been advancements in processing that allow LiDAR to work in more adverse weather conditions than earlier implementations.
 

Because beyond 30 meters, the reality is your LIDAR will give you maybe one or two points that belong to the firetruck. Two points is enough to classify a line, and that's about it. It doesn't matter how fancy your machine learning is.
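A rough back-of-the-envelope shows how return count depends on the sensor's angular resolution. The small-angle model and both resolution figures below are hypothetical examples, not specs of the Bosch unit.

```python
import math

def returns_across_target(target_width_m: float, range_m: float,
                          horiz_res_deg: float) -> int:
    """Rough count of horizontal lidar returns landing across a target."""
    spacing = range_m * math.radians(horiz_res_deg)  # small-angle approximation
    return int(target_width_m // spacing) + 1

# A ~2.5 m wide truck seen by a coarse (hypothetical) 3-degree-resolution unit:
print(returns_across_target(2.5, 30, 3.0))   # 2 returns - barely a line
# The same truck with a (hypothetical) 0.25-degree unit:
print(returns_across_target(2.5, 30, 0.25))  # 20 returns
```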
 
Maybe my math is wrong, but a quick search shows a horizontal resolution of 0.08 degrees and a vertical resolution of 0.4 degrees for a Velodyne LiDAR (HDL-64E | Velodyne Lidar), which would be about 4 cm horizontal and 21 cm vertical resolution at 30 m.
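That arithmetic checks out under the usual small-angle approximation (sample spacing ≈ range × angular resolution in radians); the spec figures are the ones quoted above.

```python
import math

def spacing_at_range(angular_res_deg: float, range_m: float) -> float:
    """Approximate spacing between adjacent lidar samples at a given range."""
    return range_m * math.radians(angular_res_deg)  # small-angle approximation

# HDL-64E figures quoted above: 0.08 deg horizontal, 0.4 deg vertical
print(round(spacing_at_range(0.08, 30) * 100, 1))  # 4.2 cm horizontal at 30 m
print(round(spacing_at_range(0.4, 30) * 100, 1))   # 20.9 cm vertical at 30 m
```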
 
which would be 4cm horizontal resolution and 21cm vertical resolution at 30m.

Maybe we are talking about toy fire trucks?

I did hear from someone in the industry that the Velodyne lidars had a 100% RMA rate over the first year, but I'm not sure what drove that number. Of course, that someone was looking for a simpler LiDAR implementation, so they had reason to besmirch them. Maybe they are better now.
 

Yeah, there are monster LIDARs like that, but that's not what this Bosch thing is, or anything you'd find on a production car. It's totally impractical to use that thing. Plus, if you operate it at full resolution, the rotation rate drops to only 5 Hz, which is too slow for highway driving.
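The resolution-vs-refresh tradeoff follows from a fixed firing budget: a spinning lidar fires at a roughly constant rate, so spinning faster spreads the same samples over more revolutions and coarsens the horizontal resolution. A sketch with a hypothetical per-channel firing rate (chosen so the numbers work out; not a quoted spec):

```python
# Sketch of the resolution/refresh tradeoff for a spinning lidar:
# with a fixed firing rate, horizontal resolution x rotation rate is constant.
# The firing-rate figure is a hypothetical illustration, not a datasheet value.

FIRINGS_PER_SEC_PER_CHANNEL = 22_500  # hypothetical per-channel firing rate

def horizontal_res_deg(rotation_hz: float) -> float:
    samples_per_rev = FIRINGS_PER_SEC_PER_CHANNEL / rotation_hz
    return 360.0 / samples_per_rev

print(round(horizontal_res_deg(5), 3))   # 0.08 deg at 5 Hz (full resolution)
print(round(horizontal_res_deg(20), 3))  # 0.32 deg at 20 Hz (much coarser)
```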
 

The Audi A8 apparently has the Valeo unit, which has a horizontal resolution of 0.25 degrees. So that would be about 13 cm at 30 meters. I guess it must not operate at a high rate, since it was only used for the Traffic Jam Pilot feature. But technology marches on.
 
I dunno. Scaling down the cost and size of technology that already exists seems like a better bet than assuming that vision-based SLAM will see major accuracy improvements in the near future. Tesla has been working on the problem for years, and Autopilot still hits things that LiDAR would have no problem detecting.
 