Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

How does Autopilot leverage Continental radar?

I know there has been a lot of debate over LiDAR vs. cameras for object detection, but I didn't realize that Autopilot leverages radar. I find this very odd, since radar 'sees' many types of materials and objects as invisible, or worse, as much larger than they are in real life - not very useful for avoiding objects. How exactly is Tesla leveraging Continental radar for Autopilot?

At this point, iPhones/iPads have LiDAR, and I'm sure other consumer devices do too. Why not simply remove the radar and replace it with LiDAR, or use both, accepting that LiDAR doesn't work well in fog or rain? This debate just seems absurd if Tesla already has sensors beyond cameras...they have already admitted that cameras alone aren't enough.
 
What I'd expect to happen is that the cameras combined with the radar create a better 3D rendering - the radar is much better at detecting accurate distance to objects than the cameras are, and the cameras are much better at determining what the object is.
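The idea above - trust the camera for identity, trust the radar for range - can be sketched in a few lines. This is a hypothetical illustration of the fusion principle being described, not Tesla's actual pipeline; the class names and fields are made up for the example.

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class CameraDetection:
    label: str         # what the object is (cameras are good at this)
    range_est: float   # coarse camera-based range estimate, metres


@dataclass
class RadarReturn:
    range_est: float   # accurate radar range, metres


def fuse(cam: CameraDetection, radar: Optional[RadarReturn]) -> Tuple[str, float]:
    """Hypothetical fusion: take the camera's classification, and prefer the
    radar's range when a matching radar return exists."""
    if radar is not None:
        return cam.label, radar.range_est
    return cam.label, cam.range_est
```

So a camera detection of a "car" at a rough 48 m, paired with a radar return at 45.2 m, would be fused as a car at 45.2 m; with no radar return (a radar-invisible object), the system falls back to the camera's estimate.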
 
What I'd expect to happen is that the cameras combined with the radar create a better 3D rendering - the radar is much better at detecting accurate distance to objects than the cameras are, and the cameras are much better at determining what the object is.

Ah, that makes sense...radar (when it can detect a radar-friendly object) provides distance, even in bad weather, and the cameras perform the object recognition. So the question that remains is: what happens when the radar can't detect that an object is there, and the cameras can't see it either? It still seems better to have LiDAR as a fail-safe than not.
 
The radar can be bounced too so your Tesla can "see" beyond the car in front of you.

Upgrading Autopilot: Seeing the World in Radar

Taking this one step further, a Tesla will also be able to bounce the radar signal under a vehicle in front - using the radar pulse signature and photon time of flight to distinguish the signal - and still brake even when trailing a car that is opaque to both vision and radar. The car in front might hit the UFO in dense fog, but the Tesla will not.
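The bounced-signal trick in that article rests on basic time-of-flight ranging: a radar pulse travels out and back, so range is half the round-trip time multiplied by the speed of light. A minimal sketch (the specific echo times below are illustrative, not real radar data):

```python
# Radar ranging by time of flight: the pulse travels to the target and back,
# so range = (speed of light * round-trip time) / 2.
C = 299_792_458.0  # speed of light, m/s


def radar_range(round_trip_s: float) -> float:
    """Range in metres from a round-trip echo time in seconds."""
    return C * round_trip_s / 2.0
```

An echo that bounced off the road under the lead car and back off a vehicle beyond it arrives later than the direct return, so it shows up as a second, more distant range - which is how the car can "see" a target it has no line of sight to.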
 
The radar can be bounced too so your Tesla can "see" beyond the car in front of you.

Upgrading Autopilot: Seeing the World in Radar

Taking this one step further, a Tesla will also be able to bounce the radar signal under a vehicle in front - using the radar pulse signature and photon time of flight to distinguish the signal - and still brake even when trailing a car that is opaque to both vision and radar. The car in front might hit the UFO in dense fog, but the Tesla will not.

Really great article, thanks for the link!
 
What I'd expect to happen is that the cameras combined with the radar create a better 3D rendering - the radar is much better at detecting accurate distance to objects than the cameras are, and the cameras are much better at determining what the object is.
If Tesla wants a 3D rendering, why doesn't it install two forward-facing cameras? That would help it with the 3D part just like our two eyes help our brains give us a 3D concept of our surroundings.
 
If Tesla wants a 3D rendering, why doesn't it install two forward-facing cameras? That would help it with the 3D part just like our two eyes help our brains give us a 3D concept of our surroundings.
I believe there might even be three front-facing cameras (not sure - what looks like the middle one might be another sensor). Still, what's easy for us (3D vision and judging distance) is far more complex for a computer using two cameras that are only six inches or so apart. It involves correctly determining very small differences in viewing angle for the same object.
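You can put numbers on how small those angle differences are. In stereo vision, an object at depth Z produces a pixel disparity of d = f * B / Z, where f is the focal length in pixels and B the camera baseline. The values below are illustrative assumptions (roughly a six-inch baseline), not Tesla camera specs:

```python
# Stereo depth from disparity: an object at depth Z metres projects with a
# disparity of d = f * B / Z pixels between the two views.
F_PX = 1000.0    # assumed focal length in pixels (illustrative)
BASELINE = 0.15  # assumed baseline in metres, ~6 inches (illustrative)


def disparity_px(depth: float) -> float:
    """Pixel disparity for an object at the given depth in metres."""
    return F_PX * BASELINE / depth


def depth_from_disparity(disparity: float) -> float:
    """Depth in metres recovered from a measured pixel disparity."""
    return F_PX * BASELINE / disparity
```

With these numbers, a car 50 m ahead shifts by only 3 pixels between the two views - so a single pixel of matching error swings the depth estimate between roughly 37.5 m and 75 m. That's why narrow-baseline stereo is a weak substitute for radar at highway distances.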
 
I believe there might even be three front-facing cameras (not sure - what looks like the middle one might be another sensor). Still, what's easy for us (3D vision and judging distance) is far more complex for a computer using two cameras that are only six inches or so apart. It involves correctly determining very small differences in viewing angle for the same object.

Yes...there are three cameras in the front: Main Forward, Wide Forward, and Narrow Forward. You can see all three on a Model Y if you step outside and look at the top center of the windshield, right in front of the rear-view mirror.