I'm really excited about autopilot and driverless cars. I'm a software engineer and a technologist. I'm an early adopter, and I'm really optimistic about technology. All of that said, I don't think we'll see self-driving cars anytime soon. It has nothing to do with the technology; it's society and policy.
I don't think we'll ever solve the Trolley Problem. Imagine a scenario where your self-driving car detects a five-year-old girl running across the street. Your car has enough data and time to make a choice: run over the girl, killing her, or smash into a barricade, killing the occupants of the car. How should we program the car in this case, and in the hundreds of cases like it? How do we determine the relative value of people's lives?
What if there are four people in the car? Does that change the answer?
What if the person crossing the street is a 94-year-old homeless man?
When a human makes a snap decision, the outcome can be called tragic, unfortunate, "an accident".
When we program our cars to make these decisions, the outcome ceases to be accidental. Now it's calculated, reviewable, and intentional.
I wouldn't want to be the one explaining to a mother that her child was killed because the car calculated that her life was worth less than the lives of the people inside it.