This math only works if each system is 100% capable of the task by itself and uses completely independent logic. The current challenge in autonomy is not hardware reliability but software/algorithm capability. In practice, combining two "independent" systems never yields the simple combinational math, because of common-mode failures and other correlated behavior. And some of the tasks any human-level system will have to handle, like a tire blowout or loss of propulsion, have very little to do with perception sensor modalities.
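To see how much a common-mode failure can poison the "simple combinational math," here is a back-of-the-envelope sketch with made-up numbers (the rates below are purely illustrative, not from any real system):

```python
# Each system misses a hazard with probability p_ind = 1%.
p_ind = 0.01
naive_both_miss = p_ind ** 2  # "simple combinational math": 1-in-10,000

# Now suppose half of each system's misses come from a shared cause
# (e.g. both algorithms are fooled by the same unusual scene).
c = 0.005                      # probability of a common-mode event
q = (p_ind - c) / (1 - c)      # residual independent miss rate per system
# Both systems miss if the common-mode event happens, OR both fail independently.
both_miss = c + (1 - c) * q ** 2

print(f"naive (independent): {naive_both_miss:.6f}")
print(f"with common mode:    {both_miss:.6f}")
```

Even with only half the failures correlated, the real joint miss rate is dominated by the common-mode term and ends up roughly 50x worse than the naive product suggests.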
You have to do really, really careful analysis to know whether that independence actually holds. On top of this, an "OR" system roughly doubles your false positive rate: either system telling you to do something must be obeyed. That creates genuinely hard conflicts, like when one system is telling you to steer out of your lane to avoid an object and the other is telling you there's an oncoming car in that lane. Which one do you trust? What do you do when the camera triggers an emergency stop for a plastic bag in the road? The fact that they are doing sensor "fusion" means there is an algorithm in the middle that is not just an "OR", and the system's reliability comes from that algorithm's performance, not just from combinations of sensor MTBFs.
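The OR-gate tradeoff is easy to quantify, again with hypothetical per-system rates and the (dubious, per the above) assumption of independence:

```python
f = 0.001  # per-system false positive rate (hypothetical)
m = 0.01   # per-system miss rate (hypothetical)

# OR gating: the vehicle reacts if EITHER system fires.
or_false_positive = 1 - (1 - f) ** 2  # ~2x the single-system rate
or_miss = m ** 2                      # only missed if BOTH miss (if independent)

print(f"single FP: {f:.6f}  OR FP: {or_false_positive:.6f}")
print(f"single miss: {m:.6f}  OR miss: {or_miss:.6f}")
```

The OR gate buys a lower miss rate only by paying nearly double the false positive rate, and only if the miss events really are independent; a fusion algorithm that arbitrates between conflicting detections sits somewhere between the two extremes, which is exactly why the system's reliability is a property of that algorithm rather than of the sensors' MTBFs.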