Indeed. In fact one must question if the gearcruncher is just trolling ....
I'm sorry that I have communicated so poorly that people think I might be trolling. I can assure you I am not. I am seriously interested in getting us to autonomous driving as a society, but I was trying to point out the challenges that lie between here and there, because unless you acknowledge and solve those, you'll never succeed. Apparently I failed miserably in trying to make that point. Like I said in the beginning, I do believe these issues are solvable, and in fact they are "easy" when 99% of cars are self driving. The interesting question is how we get to even 1% of cars self driving when there are many vested interests and laws against that. We can't get to 99% without figuring out the 1% first.
I work in a non-automotive autonomous vehicle space, and the questions I brought up are exactly the ones our industry is working through: ambiguous, poorly written laws that require interpretation to meet. We have the advantage of a single federal set of laws, as well as technical experts on the regulatory side we can work with to negotiate acceptable solutions *before* we release the product, and it still takes years to negotiate and prove you met the intent of what appear to be simple rules. There are piles of industry standards trying to navigate the rules and produce quantitative guidance, and that guidance can still leave reasonable questions about whether it is "sufficient". There are laws in my industry that mean when humans operate the vehicles, they are likely in violation of some law a large percentage of the time, but those laws only get applied when something goes wrong. I see all the same issues coming with autonomous cars, tenfold.
You can say "It's easy to program a car to stay 147 feet behind the car in front, and that's 'reasonable and prudent'." Technically, that is easy. The issue is whether the regulator will agree that 147 feet is reasonable and prudent. How did you come up with 147 feet? Why not 159? How many nines of reliability is that modeled off of? Who is liable when that turns out to be the wrong decision 1 in 1 billion miles? Will every local judge agree that was reasonable and prudent? All reasonable questions that take time to answer. You can get there, but with the great American experiment of 50 states with individual counties and cities, this is much harder than things regulated only at the federal level. I'm really interested to see how America solves it.
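To make the "why 147 and not 159?" problem concrete, here is a minimal sketch of one way an engineer might derive a following distance: reaction distance plus braking distance. Every parameter here (reaction time, deceleration rate) is an illustrative assumption, not a regulatory figure, and that's exactly the point: the "right" number depends entirely on which assumptions a regulator or judge accepts.

```python
# Hypothetical derivation of a "reasonable and prudent" following
# distance. The parameter defaults are illustrative assumptions only.

def following_distance_ft(speed_mph: float,
                          reaction_time_s: float = 1.5,
                          decel_ft_s2: float = 15.0) -> float:
    """Distance covered while the system reacts, plus distance to brake
    to a full stop at an assumed constant deceleration."""
    speed_ft_s = speed_mph * 5280 / 3600          # mph -> ft/s
    reaction_dist = speed_ft_s * reaction_time_s  # travel during reaction
    braking_dist = speed_ft_s ** 2 / (2 * decel_ft_s2)  # v^2 / (2a)
    return reaction_dist + braking_dist

# At 45 mph with these assumptions: 99 ft reaction + 145.2 ft braking.
print(round(following_distance_ft(45), 1))  # 244.2
```

Change the assumed reaction time from 1.5 s to 1.0 s, or the deceleration from 15 to 20 ft/s², and the answer moves by tens of feet, each value defensible, none self-evidently *the* reasonable and prudent one.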
You can also say "once we have this solved, this will be so much safer!" Of course it will. But for the next 30 years, self driving and human driven cars are going to exist on the same roads and will interact with one another. Lots of work in the autonomous space will eventually be wasted, since it will have no use once all vehicles are autonomous. But we can't snap our fingers and just make human driven cars illegal, so we have to solve these issues to get there, and the earlier we realize they are going to be challenges, the faster the whole transition goes. For me, the highest level question that is interesting is "Are autonomous cars ever allowed to break the written law?" If the answer is no, then the thought experiment is figuring out where that won't actually work in the real world, because autonomous cars are sometimes going to be working with different assumptions, rules, and mental models than humans are.
Anyway, I get the point that I'm not adding to the conversation in a useful way, and that most discussions of the challenges faced by autonomous cars are read as coming from someone who hates autonomy in general and believes it will never work. I'll move to the sidelines and let the autonomous car pass.