diplomat33
Average guy who loves autonomous vehicles
I see three possibilities here:
1. Waymo has gone NN-crazy with few or no guardrails and this was simply a NN hallucination
The least likely, IMHO. Waymo/Google are NN leaders and have long done E2E work, but they understand the limitations and have even pointed some out publicly. I don't see them adopting such a flaky approach in deployed cars.
2. Waymo asked Fleet Response "can I proceed through this object?" and FR mistakenly said yes
Also unlikely. As @Daniel in SD points out, FR may be able to tell the car to proceed over a trash bag or through a low hanging branch, but ramming a giant pole could only happen with a terrible system design AND a grossly incompetent (or malevolent) remote monitor.
3. A bug in their heuristic guardrail code caused it to ignore a clearly detected object
Such a bug could be triggered by FR, just as a video game player might trigger a "wall" bug by running parallel to the wall and turning suddenly, or by jumping just before hitting the wall, or whatever. We'll never know unless Waymo tells us. Which they should. I don't expect driving perfection, but I do expect "don't run straight into huge poles" perfection. This is a "Day 1" issue, a Ten Commandments violation. Such a basic failure goes straight to their core and shakes my confidence in their overall system design.
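To make possibility 3 concrete, here is a purely hypothetical toy sketch of how such a guardrail bug might work: a remote "proceed" command sets an override flag that is never cleared, so a later, clearly detected obstacle gets ignored. All class and method names here are invented for illustration; nobody outside Waymo knows what their actual code looks like.

```python
# Hypothetical illustration only -- NOT Waymo's actual design.
# Shows how a stale remote-override flag could suppress a later obstacle check.

class GuardrailSketch:
    def __init__(self):
        self.override_active = False  # set when remote monitor says "proceed"

    def remote_proceed(self):
        # The bug: the flag is set but never cleared once the original
        # low-risk obstacle (e.g. a trash bag) has been passed.
        self.override_active = True

    def should_brake(self, obstacle_detected: bool) -> bool:
        # The obstacle IS detected, but the stale override swallows it.
        if self.override_active:
            return False
        return obstacle_detected

g = GuardrailSketch()
print(g.should_brake(True))   # normal case: detected object -> brake
g.remote_proceed()            # FR approves proceeding past a trash bag
print(g.should_brake(True))   # bug: a new, real obstacle is now ignored
```

The point of the sketch is only that a perfectly good perception stack ("clearly detected object") can still be defeated by a small state-handling mistake in the heuristic layer around it, which is why the interaction with Fleet Response is a plausible trigger.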
This is why Waymo needs to be more transparent and explain these incidents. Without an explanation from Waymo, the public will inevitably speculate, which will lead to false theories, rumors, etc. And public trust will be shaken as people wonder whether the tech is really reliable or not.