
Autonomous Car Progress

I found an example of a stereo vision system that was able to estimate the distance of objects 50 metres away with an error of 3.2 metres:

A 3D stereo camera system for precisely positioning animals in space and time

It was also accurate to 9 cm at 5 metres away.

However, the baseline (distance between cameras) was 50 cm, much larger than Tesla or Mobileye have on their front-facing cameras.

Strange that Tesla and Mobileye both converged on short baseline trinocular cameras, since the top of a car windshield is over 100 cm wide and so could have provided a longer baseline. Hmm...

Maybe it’s true that it would be too difficult to calibrate the cameras for stereo vision. Or maybe they chose a short baseline because stereo vision will only be used for short distances (like human stereo vision) anyway.
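To make the baseline trade-off concrete, here is a rough sketch of the standard stereo depth-error model, where error grows with the square of distance and shrinks linearly with baseline. All parameter values (focal length in pixels, disparity error) are illustrative assumptions, not Tesla's or Mobileye's actual camera specs.

```python
# Approximate stereo depth error: dz = z^2 * disparity_error / (f * b)
# where z is range, f is focal length in pixels, b is the baseline.
# focal_px and disparity_err_px below are assumed values for illustration.

def stereo_depth_error(z_m, baseline_m, focal_px=1000.0, disparity_err_px=0.25):
    """Depth error in metres at range z_m for a given baseline."""
    return (z_m ** 2) * disparity_err_px / (focal_px * baseline_m)

# Compare a short baseline (~10 cm) with a windshield-wide one (~100 cm)
for b in (0.10, 1.00):
    print(f"baseline {b * 100:.0f} cm -> error at 50 m: "
          f"{stereo_depth_error(50.0, b):.3f} m")
```

The quadratic growth in range is why a baseline that gives centimetre accuracy at 5 m can still be metres off at 50 m.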

In terms of human distance estimation at distances beyond 50 m or 100 m, I want to turn the discussion into falsifiable claims. But I don’t know exactly how you would test the accuracy of human distance perception in a way that is relevant to this discussion.

At 80 km/h, you’re travelling 22.2 metres per second. So a difference of 10 metres (e.g. a car being 100 m away vs. 110 m away) translates to a 450 ms difference in the amount of time available to stop without hitting the car in front of you.

Human reaction time is somewhere in the 200-300 ms range under ideal conditions. So 200-300 ms is, for our purposes, the human quantum of time. Intuitively, it’s hard for me to believe that human drivers can estimate distance so accurately that they can tell the difference of 1 quantum of stopping time. It’s hard to say, because human drivers don’t even try to estimate safe following distances; they just crash and die instead.
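The arithmetic above can be checked in a couple of lines: at 80 km/h, how much stopping time does a 10 m error in distance estimation buy you?

```python
# How much time does a 10 m distance margin represent at 80 km/h?
speed_kmh = 80.0
speed_ms = speed_kmh / 3.6            # 80 km/h is about 22.2 m/s
distance_error_m = 10.0
time_error_s = distance_error_m / speed_ms
print(f"{time_error_s * 1000:.0f} ms")  # prints "450 ms"
```

That 450 ms is roughly two human reaction times, which is what makes the comparison interesting.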

Anyway, it seems like you need monocular vision for distances beyond ~50 m (although I found lots of ambiguous, confusing information I didn’t understand when I looked at a bunch of papers on the topic).

A point I don’t want to be missed is that the level of accuracy you need changes with distance. For objects within 10 m, 10 cm of accuracy is plenty. For objects further than 100 m, 10 m of accuracy is probably plenty. A self-driving car can just add another 10 m to the distance it keeps between itself and cars ahead. That compensates for the error.

At 80 km/h, a safe stopping distance is already around 50 m, so adding 10 m isn’t that much of a change. At 110 km/h, it’s around 90 m.
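The stopping-distance figures above can be sanity-checked with the usual reaction-distance-plus-braking-distance model. The deceleration and reaction time below are assumed typical values for dry pavement, not measured ones, so the results are approximate.

```python
# Rough stopping distance = reaction distance + braking distance.
# decel (m/s^2) and reaction_s are assumed typical values, not measurements.

def stopping_distance(speed_kmh, decel=7.0, reaction_s=1.0):
    v = speed_kmh / 3.6                  # convert km/h to m/s
    return v * reaction_s + v * v / (2 * decel)

for kmh in (80, 110):
    print(f"{kmh} km/h -> about {stopping_distance(kmh):.0f} m to stop")
```

Under these assumptions the totals come out in the same ballpark as the ~50 m and ~90 m figures, so adding a 10 m margin is indeed a modest change.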
 
I found an example of a stereo vision system that was able to estimate the distance of objects 50 metres away with an error of 3.2 metres:
A 3D stereo camera system for precisely positioning animals in space and time
It was also accurate to 9 cm at 5 metres away.
However, the baseline (distance between cameras) was 50 cm, much larger than Tesla or Mobileye have on their front-facing cameras.

"Subaru EyeSight comprises a pair of day/night video cameras mounted at the top of windshield, 14 inches (36cm) apart, five times farther apart than your eyes (to give it great depth perception)" via Subaru Forester review: The best small SUV thanks to EyeSight - ExtremeTech

Picture below via: EyeSight
 
Phase detection, used in single-lens reflex cameras (SLR or DSLR), can estimate distance from the input of a single lens. Light from the top and bottom edges of the lens, reaching different pairs of phase-detection sensors, can provide enough separation for distance calculation. You don't need a lot of separation between two lenses to get enough info to do that.
 
Phase detection, used in single-lens reflex cameras (SLR or DSLR), can estimate distance from the input of a single lens. Light from the top and bottom edges of the lens, reaching different pairs of phase-detection sensors, can provide enough separation for distance calculation. You don't need a lot of separation between two lenses to get enough info to do that.

Teslas do not have phase detection sensors, which are an important part of the SLR autofocus system. Phase detection also doesn't give you a distance to target; it tells you what direction and how much to adjust the focus to bring a particular area of the image into focus, which requires (a) a focusable lens, and (b) having some parts of the image out of focus in order to focus the area of interest. It manages to do this with very little effective separation (only the diameter of the main lens) by using mechanically complex and somewhat bulky optics (beam splitter, etc) and microlenses to amplify the difference in angle seen to an object by opposite sides of the lens.

It would be difficult to make this work with multiple cameras because of the very rigid alignment required. And Tesla couldn't do it with a single camera without going to a much larger lens, because the lens diameter is what determines how much angular disparity you get across the lens. And as I mentioned, you'd have to defocus one part of the scene in order to focus another; automotive cameras are typically set up to have everything in focus from a certain minimum distance to infinity.
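The point about lens diameter acting as the effective baseline can be illustrated numerically: the angular disparity an object presents across a baseline b at range z is roughly b/z radians. The 5 cm lens aperture below is an illustrative figure for a large SLR lens, not a real automotive camera spec; the 36 cm value is the Subaru EyeSight baseline quoted earlier in the thread.

```python
import math

# Angular disparity across a baseline b at range z, in milliradians.
# For phase detection, the effective baseline is only the lens diameter.
def disparity_mrad(baseline_m, range_m):
    return math.atan2(baseline_m, range_m) * 1000.0

for name, b in (("5 cm lens aperture", 0.05), ("36 cm stereo baseline", 0.36)):
    print(f"{name}: {disparity_mrad(b, 50.0):.2f} mrad at 50 m")
```

A single lens gives several times less angular disparity than even a modest stereo pair, which is why phase detection needs those bulky optics to amplify the signal.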

Every single aspect of autonomous driving, when you dive into the details, is harder than people imagine. And there are many aspects of it to dive into.
 
I know Tesla does not use phase detection. I used that example to illustrate that you don't need two light paths with a lot of separation to obtain enough distance info. BTW, phase detection does give you distance information to perform the autofocus. Some systems will do a second confirmation measurement after the lens focus is moved to that position, but many don't.
 
It is the Tesla way to get rid of what does not work instead of fixing it. :)
I think radar works fine. If it's not needed for AP, then it's fine to get rid of it.
Other companies pick technology and stick with it even when it's a dead end. Tesla's not afraid to ditch technology that isn't needed or working. Would any LIDAR based car company even consider trying to get AP to work without LIDAR?
 
@mspohr It is a known issue that certain kinds of snow block the bumper-covered Tesla radar, since there is no heating on the bumper. That is what the Twitter thread is about too, so that is what my fix referred to. In all seriousness, Musk did refer to adding a heater in another recent tweet, so they still may do that.
 


That was my twitter post in that thread too, TeslaS100Dfan. It is definitely NOT snow only. Every time it rains beyond a mist, I lose TACC and A/P on my commute. I even posted a picture of my car's instrument cluster displaying "Cruise not available", i.e. no driver assistance features available. :/ Elon's response, "Applying a hydrophobic coating to the radar", won't work, as I have PPF AND CeramicPro Gold and water already sheds off it like a duck's back.
 
That was my twitter post in that thread too, TeslaS100Dfan. It is definitely NOT snow only. Every time it rains beyond a mist, I lose TACC and A/P on my commute. I even posted a picture of my car's instrument cluster displaying "Cruise not available", i.e. no driver assistance features available. :/ Elon's response, "Applying a hydrophobic coating to the radar", won't work, as I have PPF AND CeramicPro Gold and water already sheds off it like a duck's back.

That's odd, radar is usually pretty good in rain, and I haven't noticed this in my own car. Maybe you have a hardware problem -- perhaps water or humidity is getting through a seal somewhere and mucking with the electronics.
 
That's odd, radar is usually pretty good in rain, and I haven't noticed this in my own car. Maybe you have a hardware problem -- perhaps water or humidity is getting through a seal somewhere and mucking with the electronics.

And my Tesla Service Center has checked it out numerous times and said everything is working correctly. It must be fairly common if Elon took the time to say "We’re also working on vision-only driving" in the same tweet; there must be times and places where radar is not functioning at all, or not well enough to be trusted.
 
That was my twitter post in that thread too, TeslaS100Dfan. It is definitely NOT snow only. Every time it rains beyond a mist, I lose TACC and A/P on my commute. I even posted a picture of my car's instrument cluster displaying "Cruise not available", i.e. no driver assistance features available. :/ Elon's response, "Applying a hydrophobic coating to the radar", won't work, as I have PPF AND CeramicPro Gold and water already sheds off it like a duck's back.
Here's my car in snow. The radar is covered but the camera is clear. If the AP only needed the camera, it would be fine.