Thanks for the link.
The article is very informative, but it lumps together all accidents, whether Waymo hits others or others hit it, and it is confusing about whether a collision actually happened or was only "simulated".
When I said collisions, I meant cases where the Autonomous Vehicle is the collider, the culprit that moves and hits the victim. When the Autonomous Vehicle is the victim because others move and hit it, I don't count that as a collision that is the automation system's fault.
An example is:
"Waymo cars were involved in one actual and two simulated events (i.e., events triggered by a disengagement) in which a pedestrian or cyclist struck stationary Waymo cars at low speeds."
In this scenario, Waymo was hit by a pedestrian or cyclist. The pedestrian or cyclist was not hit by Waymo.
It's like when a distracted pedestrian walks into the side of a moving firetruck with its siren blaring and smashes his face. To me, that's not the fault of the firetruck, but this article would count it as a collision too!
Note that in California, any interruption of the autonomous system is called a "disengagement". This article calls those disengagements "simulated events (i.e., events triggered by a disengagement)".
By labeling collisions "actual" and "simulated", the article can make readers think there were more actual collisions than there really were.
An example where Waymo is not the culprit but the victim in rear-end collisions:
"Waymo reported 11 actual rear-end collisions involving its cars and one simulated collision. In eight of the actual collisions, another car struck a Waymo car while it was stopped; in two of the actual collisions, another car struck a Waymo car moving at slow speeds; and in one of the actual collisions, another car struck a Waymo car while it was decelerating. The simulated collision modeled a Waymo car striking a decelerating car."
Actual: 11, including:
8: Waymo was stationary; others hit it from the rear
2: Waymo was moving at slow speed; others hit it from the rear
1: Waymo was decelerating; others hit it from the rear
Simulated: 1
The simulated one is the opposite case: per the quote, it modeled a Waymo car striking a decelerating car, i.e., Waymo would have been the one doing the hitting had the driver not disengaged. But it never occurred, because it's a model; it's simulated.
In all, in every actual rear-end collision above, Waymo is the victim, not the one who hits others; only the simulated event models Waymo as the striker, and it was prevented by a disengagement.
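The way I'm counting can be sketched as a simple tally. A minimal sketch in Python, where the event list just encodes the quoted rear-end numbers (the field names like "striker" are my own, not from the article):

```python
# Encode the quoted rear-end events; "striker" names the vehicle
# that did the hitting (these labels are my own shorthand).
events = (
    [{"kind": "actual", "striker": "other", "waymo_state": "stopped"}] * 8
    + [{"kind": "actual", "striker": "other", "waymo_state": "slow"}] * 2
    + [{"kind": "actual", "striker": "other", "waymo_state": "decelerating"}] * 1
    + [{"kind": "simulated", "striker": "waymo", "waymo_state": "moving"}] * 1
)

# Only count an event against the automation if Waymo is the striker.
at_fault = [e for e in events if e["striker"] == "waymo"]
actual_at_fault = [e for e in at_fault if e["kind"] == "actual"]

print(len(events))           # 12 rear-end events total
print(len(at_fault))         # 1, and it is only simulated
print(len(actual_at_fault))  # 0 actual collisions caused by Waymo
```

Counted this way, the rear-end numbers say zero actual collisions caused by Waymo, which is the distinction the article's framing blurs.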
So, first things first: Autonomous Vehicles need the capability to avoid hitting others.
Then the next step is intelligence: how to make sure others don't hit them.
Like in the case of the firetruck: first, it should not hit others. But when others, like the distracted pedestrian, walk toward it, maybe it should spray water at the pedestrian to wake him up.