Everyone is using cameras, though. Who is using a lidar-centric system? Everyone, including Tesla, is using a combination of sensors.
"so if you look at a suite that for example Tesla is using which is ultrasonic radar and camera and you compare it to just lidar and see how these paths compare that actually the suite of camera radar and ultrasonic are comparable to lidar so that those are the two comparisons that we have you have the costly non machine-learning way of lidar and you have the cheap but needs a lot of data and is not
explainable reliable in the near term vision based approach and those are the two competing approaches. Now of course Waymo will talk about they're trying to use both but ultimately the question is who catches, who is the failsafe
in the semi autonomous way when there's a camera based method the human is the failsafe when you say oh crap I don't know what to do the human catches it.
In the fully autonomous mode, so what Waymo's working on, and others, the failsafe is lidar, failsafe is maps, that you can't rely on the human but you know this roads so well that if the camera is freaked out if there's any of the sensors freaked out that you're able to you have such good maps you have such good accurate sensors that the fundamental problem of obstacle avoidance which is what safety is about can be solved, the question is what kind of experience that creates."
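A minimal sketch of the "who catches it" logic he's describing, in Python. The mode names, status fields, and action strings are all hypothetical, just to make the two failsafe paths concrete; this assumes the primary vision path has already reported low confidence:

```python
from dataclasses import dataclass
from enum import Enum, auto

class Mode(Enum):
    SEMI_AUTONOMOUS = auto()   # camera-based; the human is the failsafe
    FULLY_AUTONOMOUS = auto()  # lidar + detailed maps are the failsafe

@dataclass
class BackupStatus:
    lidar_ok: bool        # redundant ranging sensor still healthy
    map_localized: bool   # confidently localized against a detailed prior map

def who_catches(mode: Mode, backup: BackupStatus) -> str:
    """Decide who 'catches' when the primary (vision) path loses confidence."""
    if mode is Mode.SEMI_AUTONOMOUS:
        # Semi-autonomous: the system says "I don't know what to do"
        # and the human driver catches it.
        return "handoff_to_human"
    # Fully autonomous: no human to rely on, so fall back to the
    # redundant stack -- lidar for obstacle avoidance plus maps that
    # know the road well enough to survive a confused camera.
    if backup.lidar_ok and backup.map_localized:
        return "lidar_plus_map_obstacle_avoidance"
    # If even the redundant stack is degraded, all that's left is a
    # minimal-risk maneuver (slow down and stop).
    return "minimal_risk_stop"

# e.g. a Waymo-style vehicle whose camera just "freaked out":
print(who_catches(Mode.FULLY_AUTONOMOUS,
                  BackupStatus(lidar_ok=True, map_localized=True)))
# -> lidar_plus_map_obstacle_avoidance
```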
It sounds like he's skeptical that a camera-only approach can be safe enough for fully autonomous vehicles in the near term. Autopilot is still running into fixed objects even after 1 billion miles with the current sensor suite. Maybe HW3 will fix it.
Trying to game California's autonomous testing rules didn't work out so well for Uber: videos of its cars running red lights are what caused the DMV and attorney general to bring the hammer down. We'll see what happens if Tesla tries a similar testing strategy.
P.S. YouTube transcribed it! Neural nets are awesome.