Mobileye’s approach makes no sense. The way Amnon explains it is also nonsensical. I believe they’re only taking this “different” approach as a marketing ploy to say they’re special.
I think it's easy to understand.
Mobileye was visionary in believing machines could drive with just cameras, no other sensors needed. It was able to convince Tesla to use its system.
Mobileye wanted autonomous vehicles too, but its approach was step by step, very conservative.
Instead of trying to figure everything out with all the sensors at once, it wanted to concentrate on vision first and work out the bugs without the distraction of other sensors (if there's a fault, it's the camera system's fault, not the radar's, not the sonars', because vision is all there is to work with).
It's similar to telling a clarinet from a flute with just one sensor: your ears, with your eyes shut. If the guess is wrong, it's the ears' fault, because the eyes are shut and not contributing any guesses.
Once the ears are perfected, plug the ears and open the eyes, and tell the clarinet from the flute by sight alone.
With fusion, the eyes and the ears help each other at the same time to guess which is which.
Now back to the history: in 2014, Mobileye sold Tesla an ADAS that required the driver's attention on the road, but when Tesla first sold Autopilot, Tesla was too enthusiastic and started to talk about how Autopilot could become an autonomous vehicle someday.
Mobileye didn't like that kind of talk, which might confuse an ADAS with an autonomous-vehicle system.
In 2016, after the first Autopilot death, Mobileye didn't want anything to do with Tesla anymore, for fear of being associated with recklessly overselling an ADAS as if it could do more, or worse, as if it were full self-driving.
Mobileye continued to perfect its ADAS while researching the autonomous system with vision alone.
That doesn't mean Mobileye won't use other sensors once vision is perfected. It's now adding radar + lidar, for commercial fleets first, then for consumers in 2025.
I think Mobileye's redundancy is not traditional redundancy like Tesla's, which puts two identical chips on a board, each with its own power circuit, so that if one circuit fails the other can take over.
Mobileye's redundancy works like this: if the camera thought it was looking at white sky, as in the 2016 fatal Autopilot vs. tractor-trailer accident, and failed to initiate braking, the other, independent radar + lidar system would step in and initiate braking.
Both independent systems run at the same time; when one fails to detect the danger, the other steps in and intervenes.
It's not fusion, because neither system needs the other to identify danger (fusion is like the ears asking the eyes to confirm it's a flute, not a clarinet).
It's not traditional redundancy either, because the two systems are not exact duplicates (two identical chips, two identical programs...).
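The distinction can be sketched in a few lines of Python. Everything here is hypothetical and illustrative, not Mobileye's actual design; the "fusion" shown is a deliberately naive average of confidences, just to show how combining evidence before deciding differs from letting each channel brake on its own:

```python
# Hypothetical sketch: "true redundancy" (each channel decides alone,
# combined with OR) vs. a naive fusion (evidence combined before any decision).
# All names and numbers are made up for illustration.

def true_redundancy_brake(vision_sees_obstacle: bool,
                          radar_lidar_sees_obstacle: bool) -> bool:
    """Each independent channel can trigger braking on its own.
    If vision mistakes a trailer for white sky, radar+lidar still brakes."""
    return vision_sees_obstacle or radar_lidar_sees_obstacle

def naive_fusion_brake(vision_confidence: float,
                       radar_lidar_confidence: float,
                       threshold: float = 0.5) -> bool:
    """Fusion merges evidence from both sensors into one estimate
    *before* deciding; neither channel decides alone."""
    combined = (vision_confidence + radar_lidar_confidence) / 2
    return combined > threshold

# White-sky-style scenario: vision sees nothing (confidence 0.0),
# radar+lidar clearly sees the trailer (confidence 0.9).
print(true_redundancy_brake(False, True))  # True: the independent channel brakes
print(naive_fusion_brake(0.0, 0.9))        # False: the averaged evidence stays below threshold
```

The point of the toy example: with independent channels, one confident detection is enough to act; with this naive averaging fusion, a blind channel can drag the combined estimate below the decision threshold. (Real fusion systems are far more sophisticated than an average, but the structural difference, deciding together versus deciding alone, is the same.)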
A mistaken white sky is only one of many possible camera failures.
The camera fails to detect the bus and the car due to the sun being too low on the horizon, but lidar has no problem:
The camera fails to detect two pedestrians in low light, but lidar has no problem: