Safety driver distraction is a really significant problem whilst we have this sort of halfway-house autonomy. Older aeroplanes, with an autopilot but without a full flight management system (FMS), had exactly the same problem. The solution in aviation was to increase the capability of the autopilot systems so that, instead of just flying at a set height and course, they had enough situational awareness to fly the aeroplane as safely as, and perhaps better than, a human, and then to set stringent rules as to where the aircraft can operate autonomously and where it cannot.
The situation we have with cars is that no such stringent rules are being enforced. Some manufacturers are choosing to allow cars to drive in conditions that are outside the scope and capability of the autonomous control system, and, at the moment, the cars permit this. It may be that, in order to get full self-driving working safely initially, while self-driving vehicles are very much in the minority, some sort of geofencing needs to be enforced. Only allowing autonomous driving on motorways and dual carriageways, for example, might be a reasonable start, with the system being disabled in towns and along narrow lanes. As the system matures, the geofencing could then be relaxed.
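The enforcement logic described above is simple in principle. A minimal sketch, purely illustrative: the road classes and the `autonomy_allowed` check here are hypothetical, not any manufacturer's actual API, and a real system would also need reliable map data and positioning.

```python
from enum import Enum, auto

class RoadType(Enum):
    MOTORWAY = auto()
    DUAL_CARRIAGEWAY = auto()
    URBAN = auto()
    NARROW_LANE = auto()

# Road classes on which autonomy is permitted during the initial,
# tightly geofenced rollout suggested above.
PERMITTED = {RoadType.MOTORWAY, RoadType.DUAL_CARRIAGEWAY}

def autonomy_allowed(road_type: RoadType) -> bool:
    """Return True if autonomous driving may be engaged on this road class."""
    return road_type in PERMITTED
```

Relaxing the geofence as the system matures then amounts to extending the permitted set (e.g. adding `RoadType.URBAN`) rather than changing the enforcement logic itself.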
"Safety driver" awareness is a massive challenge as we humans are not very good at remaining focused, especially where a system is operating and we are expected to step in at short notice. It's one of the reasons why large commercial aircraft have to have a Pilot-In-Charge and Co-Pilot, as this system allows for not only load-sharing but also monitoring of the other's actions.
However, in the unfortunate case of Elaine Herzberg, as in so many disasters, there were many avoidable steps into the incident pit.
Uber rolled out a system too quickly; they had plenty of internal reports to this effect. The victim chose to cross a multi-lane road at night despite signs stating that crossing was prohibited. She also tested positive for marijuana and methamphetamine, though it's not possible to say she was under the influence at the time. The "safety driver" allowed herself to be distracted by watching TV on her phone from the start of her shift, and Uber clearly had little or no monitoring of its safety drivers in place.
Although this incident will be held up as a case against autonomous vehicles, it is a poor one on which to base judgements, for all of the reasons above.
Perfection can't be allowed to become the enemy of progress. Aircraft have used autopilots for decades and became incrementally better at it.
As with many things, legislation will advance more slowly than the technology. Waymo are following the strictly geofenced model. GM, with Super Cruise, are following a road-type model in which the system (until next year) will only operate on interstates and freeways. Which pathway will prove best is unknown at this time.
Ultimately, FSD is an engineering problem. There is nothing in the laws of physics that is a show-stopper, so its solution lies purely in engineering, and as such it will be solved. At the moment road accidents kill 1.35 MILLION people per year, but as usual we humans are very poor at judging risk. In reality we should be screaming from the rooftops about the 1.35 million killed every year and doing everything we can to stop meatbags like ourselves being allowed to continue that slaughter.