You're still going to interact with humans even if every car on the road is autonomous. I don't see pedestrians or bicycles going away.
I'm pointing to real-world systems implemented in far more ideal circumstances: every aircraft uses the same collision-avoidance protocols, and a great deal of effort goes into making sure two planes don't issue evasive advisories that increase the likelihood of a collision. And yet there are still situations where the recommended evasive actions make a crash more likely.
But I need not even look very far to find a TACC accident:
3 day old import P85D crashed while using TACC
That particular accident happened because of an implementation detail of Tesla's TACC: objects that have never been observed moving are presumed not to be in your way. If the system sees an object moving and then coming to a stop, it will continue to track it. I suppose you could call the lead car changing lanes to reveal a stopped vehicle "inexplicable human behavior," but my point is that it might be entirely reasonable for an autonomous car to decide to change lanes because of stopped traffic ahead.
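The filtering behavior described above can be sketched roughly as follows. This is a hypothetical illustration, not Tesla's actual implementation; the class names and the speed threshold are invented for the example.

```python
# Hypothetical sketch of stationary-target filtering in a TACC-like system.
# A target never observed moving is presumed to be roadside clutter
# (signs, overpasses) and is ignored; a target seen moving that later
# stops keeps being tracked. Names and thresholds are invented.

SPEED_THRESHOLD = 0.5  # m/s; below this a target reads as stationary

class Target:
    def __init__(self, target_id):
        self.target_id = target_id
        self.ever_seen_moving = False

    def update(self, speed_mps):
        # Once a target has been observed moving, remember that forever.
        if speed_mps > SPEED_THRESHOLD:
            self.ever_seen_moving = True

    def is_relevant(self):
        # Only targets that have at some point been seen moving are
        # considered by the follow-distance logic.
        return self.ever_seen_moving

def relevant_targets(targets):
    return [t for t in targets if t.is_relevant()]
```

Under this logic, a car that was always stopped in your lane (revealed when the car ahead changes lanes) is never "relevant," which is exactly the failure mode in the accident above.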
I don't deny that computers ought to be able to do a better job than human drivers in most cases. But you said that if we had autopilot-type systems we'd have "no problem." Now you're walking that back to "pretty damn well." These systems will improve safety, but they will not eliminate all accidents.
To be clear, I'm also not advocating that these systems need to be perfect to be adopted; perfect is the enemy of good. But I think it's important that we have realistic expectations about the results.