strangecosmos
Non-Member
I found an example of a stereo vision system that was able to estimate the distance of objects 50 metres away with an error of 3.2 metres:
A 3D stereo camera system for precisely positioning animals in space and time
It was also accurate to 9 cm at 5 metres away.
However, the baseline (distance between cameras) was 50 cm, much larger than Tesla or Mobileye have on their front-facing cameras.
Strange that Tesla and Mobileye both converged on short-baseline trinocular cameras, since the top of a car windshield is over 100 cm wide and so could have provided a longer baseline. Hmm...
Maybe it’s true that it would be too difficult to calibrate the cameras for stereo vision. Or maybe they chose a short baseline because stereo vision will only be used for short distances (like human stereo vision) anyway.
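For a rough sense of scale, here's a quick sketch of how depth error grows with range and shrinks with baseline in the standard pinhole stereo model. This is not the paper's method, and the focal length and sub-pixel disparity error below are my own guesses for illustration, not Tesla or Mobileye numbers:

```python
# Pinhole stereo model: depth z = f * B / d, so a disparity error of
# delta_d pixels gives roughly delta_z ~ z^2 * delta_d / (f * B).
# focal_px and disparity_err_px are assumed values for illustration only.

def depth_error_m(z_m, baseline_m, focal_px=1000.0, disparity_err_px=0.5):
    """Approximate depth error (metres) at range z_m for a given baseline."""
    return (z_m ** 2) * disparity_err_px / (focal_px * baseline_m)

for baseline_m in (0.5, 0.12):   # 50 cm (the paper) vs. ~12 cm (hypothetical short baseline)
    for z_m in (5, 50, 100):
        print(f"baseline {baseline_m * 100:.0f} cm, range {z_m:>3} m: "
              f"error ~{depth_error_m(z_m, baseline_m):.2f} m")
```

The point of the sketch is just the scaling: error grows with the square of distance and falls linearly with baseline, which is why a short baseline looks like a deliberate choice to only use stereo at short range.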
In terms of human distance estimation at distances beyond 50 m or 100 m, I want to turn the discussion into falsifiable claims. But I don’t know exactly how you would test the accuracy of human distance perception in a way that is relevant to this discussion.
At 80 km/h, you’re travelling 22.2 metres per second. So a difference of 10 metres (e.g. a car being 90 m away vs. 100 m away) translates to a 450 ms difference in the time available to stop without hitting the car in front of you.
Human reaction time is somewhere in the 200-300 ms range under ideal conditions. So 200-300 ms is, for our purposes, the human quantum of time. Intuitively, it’s hard for me to believe that human drivers can estimate distance so accurately that they can tell the difference of a single quantum of stopping time. It’s hard to say, because human drivers don’t even try to estimate safe following distances; they just crash and die instead.
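For what it's worth, the arithmetic behind that 450 ms figure is just:

```python
# Quick check on the 450 ms figure above.
speed_m_per_s = 80 / 3.6          # 80 km/h ~ 22.2 m/s
distance_diff_m = 10              # e.g. 90 m vs. 100 m to the car ahead
time_diff_ms = distance_diff_m / speed_m_per_s * 1000
print(f"{time_diff_ms:.0f} ms")   # ~450 ms, vs. a 200-300 ms reaction time
```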
Anyway, it seems like you need monocular vision for distances beyond ~50 m (although I found a lot of ambiguous, confusing information I didn’t understand when I looked at a bunch of papers on the topic).
A point I don’t want to get lost: the level of accuracy you need changes with distance. For objects within 10 m, 10 cm of accuracy is plenty. For objects further than 100 m, 10 m of accuracy is probably plenty. A self-driving car can just add another 10 m to the distance it keeps between itself and the cars ahead, which compensates for the error.
At 80 km/h, a safe stopping distance is already around 50 m, so adding 10 m isn’t that much of a change. At 110 km/h, it’s around 90 m.
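For reference, those ~50 m and ~90 m figures roughly fall out of the textbook reaction-distance-plus-braking-distance formula. The reaction time and deceleration below are assumed values, not measured ones, so treat this as a sketch:

```python
# Stopping distance ~ reaction distance + braking distance (v^2 / 2a).
REACTION_TIME_S = 1.0     # assumed reaction time, typical of driving guidelines
DECEL_MS2 = 7.0           # assumed hard braking on dry pavement

def stopping_distance_m(speed_kmh, margin_m=0.0):
    v = speed_kmh / 3.6
    return v * REACTION_TIME_S + v ** 2 / (2 * DECEL_MS2) + margin_m

for kmh in (80, 110):
    base = stopping_distance_m(kmh)
    padded = stopping_distance_m(kmh, margin_m=10)
    print(f"{kmh} km/h: ~{base:.0f} m, with +10 m error margin ~{padded:.0f} m")
```

Adding a 10 m margin changes the following distance by roughly 10-20%, which seems like a tolerable price for the error.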