Camera - struggles in low-light conditions and direct sunlight
Lidar - excels in low-light conditions and direct sunlight
Different failure modes.
The camera Tesla uses has a 115 dB dynamic range, and the human eye has a dynamic contrast ratio of about 1,000,000:1, or 120 dB.
But the most important thing is resolution. The Tesla camera has 1.2 megapixels and can't even read a speed limit sign 100 meters away in broad daylight.
First, it says ">115 dB". Second, you're mixing up the static and dynamic contrast ratios of that camera.
The camera can provide only 20-bit data, which is about a 120 dB contrast ratio, hence ">115 dB". That limitation is set by the maximum output signal bit depth, and it is effectively the camera's best-case static contrast ratio in a single frame (combining three exposures with HDR).
The dynamic contrast ratio includes the static contrast ratio as altered by:
- The iris, assuming the camera has one.
- The exposure time (which any camera can vary, but your eye can't usefully).
- The gain (which any camera can change, but your eye can only to a very limited degree, very slowly, via rhodopsin bleaching and regeneration).
So you're comparing the dynamic contrast ratio of your eye to the static contrast ratio of the camera. The dynamic contrast ratio of the camera is way, way wider than 120 dB.
It can, in a single HDR frame, represent very nearly the entire dynamic contrast ratio that your eye can see.
The static ratio of the camera in linear (non-HDR) mode is 72.24 dB. The static ratio of your eye is, if I'm understanding correctly, only about 40 dB.
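To sanity-check those numbers: the conversion from linear bit depth to a best-case contrast ratio in dB is just 20·log10(2^bits). A quick sketch, using the bit depths discussed above (20-bit HDR output and an assumed 12-bit linear readout):

```python
import math

def bits_to_db(bits: int) -> float:
    """Best-case static contrast ratio, in dB, of an N-bit linear output."""
    return 20 * math.log10(2 ** bits)

print(f"20-bit HDR output:  {bits_to_db(20):.2f} dB")  # ~120.41 dB, hence ">115 dB"
print(f"12-bit linear mode: {bits_to_db(12):.2f} dB")  # ~72.25 dB (the 72.24 figure, modulo rounding)
```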
The camera stomps your eye into the ground.
The human eye, on the other hand, has 576 megapixels.
And low-quality optics whose angular resolution renders that largely moot. Also, that's 576 megapixels in one direction. Your car has cameras pointing in every direction. Also, remember that it just has to be better than the legal threshold, which in most states is 20/40.
Also, your eyes do not actually see 576 MP in a single image; that figure assumes a full image at foveal resolution across your eye's entire field of view. But you only have high resolution near the center of your eye; everywhere else, your vision is absolute crap. The effective resolution of a single frame of the dense part of the human eye is only on the order of 5 to 15 MP. That's still better than the Tesla cameras at 1.2 MP, but potentially by as little as a factor of 4 (which would be only 2x the resolution in each direction).
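That factor-of-4 figure is just the ratio of pixel counts; linear resolution scales with its square root. Using the 5-15 MP foveal estimate from above:

```python
import math

tesla_mp = 1.2
for eye_mp in (5.0, 15.0):  # low and high effective-foveal-resolution estimates
    area_ratio = eye_mp / tesla_mp          # ratio of total pixel counts
    linear_ratio = math.sqrt(area_ratio)    # ratio of resolution per direction
    print(f"{eye_mp} MP eye vs {tesla_mp} MP camera: "
          f"{area_ratio:.1f}x pixels, {linear_ratio:.1f}x linear resolution")
```

So even the generous 15 MP estimate is only about 3.5x the camera's linear resolution.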
And in the only direction where long-distance vision really matters much (the direction your car is moving in), your car also has a zoomed-in camera (and a wide-angle camera). The 35mm-equivalent focal length of your eye is about 43mm. The cameras on a Tesla are about 6mm, 46mm, and 69mm. That last one effectively gives it about the same single-frame resolution at a distance as the worst-case estimate for human eyes. (The wide-angle camera is probably mostly useless.)
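To see what those focal lengths buy, here is a small-angle estimate of how many pixels a target spans, using the 36 mm frame width implied by 35mm-equivalent focal lengths. The 0.75 m sign width and 1280-pixel horizontal resolution are illustrative assumptions, not Tesla specs:

```python
def pixels_on_target(f_eq_mm: float, h_pixels: int,
                     target_m: float, dist_m: float) -> float:
    """Small-angle estimate of pixels spanned by a target of width target_m
    at dist_m, for a 35mm-equivalent focal length (36 mm frame width)."""
    frame_mm = 36.0
    image_mm = f_eq_mm * target_m / dist_m  # projected width on the "frame"
    return h_pixels * image_mm / frame_mm

# Assumed: 0.75 m wide speed-limit sign at 100 m, 1280-px-wide sensor.
for f_eq in (6, 46, 69):
    px = pixels_on_target(f_eq, 1280, 0.75, 100)
    print(f"{f_eq}mm-equivalent: {px:.1f} px across the sign")
```

The point is the relative geometry: under these assumed numbers, the narrow camera puts roughly 11x more pixels on the sign than the wide-angle one.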
Also, computers can do inter-frame superresolution to resolve detail way smaller than you can resolve in a single frame. Your brain really doesn't do much of that.
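A toy 1D illustration of that idea: if four low-resolution captures of the same scene are each offset by a quarter sample, interleaving them recovers the full-resolution signal. (Real inter-frame superresolution has to estimate the shifts and handle blur and noise; this sketch assumes the shifts are known and exact.)

```python
import numpy as np

hi = np.sin(np.linspace(0, 8 * np.pi, 400))  # "true" high-res scene
frames = [hi[k::4] for k in range(4)]        # four quarter-sample-shifted low-res captures

recon = np.empty(400)
for k, frame in enumerate(frames):
    recon[k::4] = frame                      # interleave the shifted captures

assert np.allclose(recon, hi)                # full resolution recovered exactly
```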
Also, your brain is mostly only usefully paying attention to the center of your field of view. You can almost entirely miss stuff going on in your peripheral vision (where the image quality is crap) that a computer could easily see by virtue of having way more cameras pointing in more directions, even at lower resolution (but still higher resolution and more in-focus than your peripheral vision).
In short, the cameras win again.
The whole point is that the probability of them failing at the same time is extremely low. I have had my lights burn out on me a couple of times while driving over the years. You are claiming that my lidar would immediately go out at the exact same time.
Both lights and the high/low-beam failover? Unlikely. For your headlights to fail completely, you have to lose four different lighting elements. The wiring or switch could fail, maybe, particularly if there's a design defect, but that's probably at most a once-in-a-billion-miles thing, particularly with modern LED headlamps.
And if the failure is caused by a 12V electrical system failure, your LIDAR is probably going down, too, so it isn't entirely independent.
It's funny that people like you claim there's no redundancy or backup here. You say things like: if the camera says there's no pedestrian and the lidar/radar says there is, which one should you believe?
This is like asking why have two guards at the outpost; what if one guard sees someone move in the bushes for a moment and the other guard didn't? Which one should you believe? It's absurd, because you investigate whatever either guard sees.
None of these are realistic scenarios or comparisons. Two guards is like two cameras. RADAR and LIDAR are more like a guard and a guard dog. After the hundredth time the dog guard goes nuts because of a stray cat, you shoot the guard dog and hire a second guard. That's what RADAR is like. If RADAR says there's a pedestrian, it's more than likely a pothole.
LIDAR is somewhat better, sure, but the burden of proof is still on you to show that there are realistic, common scenarios in which such a detection failure would occur with cameras over a large enough number of frames to result in an accident that can't be fixed by having a second camera from a different angle (remembering that there are already three cameras in the only direction that really matters much).
The rest of your post just continues to repeat the same fallacy in different words.