
Morals of self-driving cars

Discussion in 'Model S' started by Usatyke, Oct 23, 2015.

  1. Usatyke

    Usatyke Member

    Joined:
    Feb 11, 2015
    Messages:
    76
    Location:
    Maltby, WA
  2. Max*

    Max* Autopilot != Autonomous

    Joined:
    Apr 8, 2015
    Messages:
    4,866
    Location:
    NoVa
    Silly article. The car should not be in a situation that forces it to decide to kill 10 pedestrians. How the hell did 10 pedestrians get in front of the car? The car will be programmed, in advance, to avoid these types of situations.

    And I can understand why most people would prefer to be outside of the self-driving car. Fear-mongering about new technology -- oh no, the self-driving car will destroy society, etc.
     
  3. SabrToothSqrl

    SabrToothSqrl Active Member

    Joined:
    Dec 5, 2014
    Messages:
    1,525
    Location:
    PA
  4. Electricfan

    Electricfan Member

    Joined:
    Aug 24, 2013
    Messages:
    971
    Location:
    Houston
    I don't think the Model S can make a decision like this yet; it doesn't recognize things well enough (I could be mistaken).

    But future generations of the Model S will. Wonder what Tesla will do? And will they tell the buyers?

    Very interesting article though. I'm not sure I could steer into a wall even to save 10 baby buggies full of babies, if I knew I was going to die. But I can say I'd like my car to do that if the situation ever arose.
     
  5. ItsNotAboutTheMoney

    ItsNotAboutTheMoney Active Member

    Joined:
    Jul 12, 2012
    Messages:
    4,501
    Location:
    Maine
    I wouldn't, at least not until I'm retired. 10 mothers would be a different matter.

    Anyway, ignore the sensationalist headline. An autonomous car has to be programmed to make the best choice, and sometimes the best choice is just the least bad choice. There's no reason why "least bad" has to be totally objective rather than biased toward the driver.
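    A toy illustration of what "least bad, but biased toward the driver" could mean: the same cost minimization, with harm to the car's occupants weighted differently from harm to others. The weight and the harm scores are made-up policy knobs, not anything any manufacturer has disclosed.

        OCCUPANT_WEIGHT = 0.8  # <1.0 biases choices toward protecting the occupants

        def total_cost(occupant_harm, external_harm, w=OCCUPANT_WEIGHT):
            """Combine harms into one score; lower is better."""
            return w * occupant_harm + external_harm

        # Choice A: hit the wall (high occupant harm, no external harm).
        # Choice B: stay on course (low occupant harm, high external harm).
        print(total_cost(90, 0))   # 72.0
        print(total_cost(5, 100))  # 104.0 -> even a biased car picks choice A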
     
  6. mikeash

    mikeash Active Member

    Joined:
    Oct 26, 2014
    Messages:
    1,107
    Location:
    Fairfax, VA, USA
    I'm always amused by discussions like these. As if human drivers perform anything like these calculations.

    You know what happens when a human driver is in control of a car faced with an unavoidable accident but with several different outcomes? They pick one on instinct or by luck and that's what happens.

    Certainly it's an interesting hypothetical, but I don't think it has much real-world relevance. In 99.999% of crashes there will be one clear best action.
     
  7. Max*

    Max* Autopilot != Autonomous

    Joined:
    Apr 8, 2015
    Messages:
    4,866
    Location:
    NoVa
    Also, why is imminent death the assumed outcome? Why can't the car slam into the wall and have the occupant survive with injuries? The Model S is the safest car on the road, and autonomous driving might not allow speeding, so the chances of surviving a collision are much higher.
     
  8. TEG

    TEG TMC Moderator

    Joined:
    Aug 20, 2006
    Messages:
    17,252
    Location:
    Silicon Valley
    I had been pondering these types of 'moral dilemmas' of self-driving cars for a while. I assume there will be debates and laws wrapped around some of the decisions that need to be sorted out.

    The software can't predict everything that might happen, and will sometimes run into a situation where physics won't allow the vehicle to avoid a collision. Let's say a car runs a red light right in front of the self-driving car, and the software sees it is about to get hit on the driver's side door by a fast-moving car. If it veers to the side quickly it may avoid having the driver's door get hit, but could hit a pedestrian in the crosswalk.

    The 'classic' one I heard about was the deer in the road. Apparently, in many cases, it is safer to hit the deer than to veer onto the shoulder and risk sliding off the road into a ditch or tree. So, people will get upset that self-driving cars sometimes 'choose' to hit animals in the road.

    Many drivers, when traffic is heavy, don't maintain a safe following distance that would allow braking under all conditions. I have seen evidence that self-driving cars are being programmed to keep a "normal" spacing so as not to upset other drivers around them, but there could be a situation where a self-driving car is still unable to stop in time if an accident happened right in front of it.

    For the sake of argument, let's say the accident up ahead was caused by oil on the road, and the self-driving car will not be able to brake well enough on the oil to avoid a collision. So, what does the software know? The value of the two vehicles ahead that it can pick to hit? Which one has better insurance coverage? The software may sometimes be in a position to make "lesser of two evils" decisions based on the expected cost of the aftermath.
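    A minimal Python sketch of that cost-based "lesser of two evils" logic. The maneuver names, probabilities, and dollar costs are invented for illustration; no real autonomous-driving stack is this simple.

        # Pick the maneuver with the lowest expected cost. Each option is
        # (name, probability_of_collision, estimated_cost_of_aftermath),
        # where the cost folds in injury risk and property damage.
        def least_bad_option(options):
            def expected_cost(option):
                _, p_collision, cost = option
                return p_collision * cost
            return min(options, key=expected_cost)

        # Illustrative options for the oil-on-the-road scenario above.
        options = [
            ("brake_straight_hit_suv",   0.9, 40_000),  # heavier vehicle, better occupant protection
            ("swerve_left_hit_compact",  0.7, 60_000),  # lighter car, higher injury risk
            ("swerve_right_to_shoulder", 0.5, 90_000),  # ditch/tree risk, as in the deer example
        ]
        print(least_bad_option(options)[0])  # -> brake_straight_hit_suv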

    Back to the thought of "how could a situation like this happen?" Say a child ran into the road chasing a ball that rolled out between two parked cars... And mom ran after them... And both were in the road and the car knew it would end up hitting one of them...
     
  9. Johan

    Johan Took a TSLA bear test. Came back negative.

    Joined:
    Feb 9, 2012
    Messages:
    6,890
    Location:
    Drammen, Norway
    In the very difficult cases, where it's obvious a collision will happen, it will likely be quite far into the future before the sensor suite + software algorithms are able to make an analysis with high confidence as to (in TEG's last example) whether to hit the child or the mother. The first difficulty is being sure that there is a child and a mother (with a high level of certainty), and the next is for the car to have a value-optimization function to actually make a deliberate choice. IMO this will require human-level AI (with lightning-fast processing).

    So in reality I think we will see self-driving cars being programmed to behave in the most PREDICTABLE way in these types of situations, rather than choosing between lesser evils.
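    A sketch of that "predictable by default" idea: attempt a deliberate evasive choice only when perception confidence is high, and otherwise fall back to the most predictable action (braking hard in the lane). The threshold, labels, and action names are made-up illustrations, not any real system's API.

        CONFIDENCE_THRESHOLD = 0.95  # assumed policy knob, not a real spec

        def choose_action(detections):
            """detections: list of (label, confidence) for obstacles ahead."""
            if detections and all(conf >= CONFIDENCE_THRESHOLD for _, conf in detections):
                # High-confidence world model: a value-optimization
                # function could run here (the hard AI problem above).
                return "plan_evasive_maneuver"
            # Uncertain world model: do the predictable thing.
            return "brake_hard_in_lane"

        print(choose_action([("child", 0.88), ("adult", 0.97)]))  # -> brake_hard_in_lane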
     
  10. mikeash

    mikeash Active Member

    Joined:
    Oct 26, 2014
    Messages:
    1,107
    Location:
    Fairfax, VA, USA
    The easy answer would be to allow extremely fast credit database lookups so the car can simply optimize for minimizing the sum of the FICO scores of those killed in the crash.
     
  11. Stoneymonster

    Stoneymonster Active Member

    Joined:
    Jan 8, 2013
    Messages:
    1,554
    Location:
    Aptos, Ca
    That's just a slippery slope to self-driving cars running their own protection rackets!
     
  12. engle

    engle Member

    Joined:
    Aug 25, 2011
    Messages:
    440
    #12 engle, Oct 23, 2015
    Last edited: Oct 23, 2015
    Let's Focus on the "Big Picture": Alcohol-Impaired Driver-Caused Death Reduction

    These statistics are from the CDC and the NHTSA

    http://www.cdc.gov/motorvehiclesafety/impaired_driving/impaired-drv_factsheet.html

    http://www-nrd.nhtsa.dot.gov/Pubs/812102.pdf

    "In 2013, 10,076 people were killed in alcohol-impaired driving crashes ... 31% of all traffic-related deaths in the United States." There were about 32,500 total deaths in traffic-related accidents. The economic cost of the alcohol-impaired-driving crashes in the US in 2010 was $49.8 billion, so cost of all accidents that year was about $160B.

    Let's assume that by the year 2016 + X, 50% of all vehicles on the road are self-driving. They won't be "perfect" drivers, but they'll be far better than we humans are. If they were "perfect", then about 16,250 human lives, and $80B in economic cost from "lost productivity, workplace losses, legal and court expenses, medical costs, EMS, insurance administration, congestion, and property damage", would be saved. Finally, let's conservatively assume these vehicles cost the lives of 1,000 people because of defects in their algorithms. Based on Google's autonomous vehicle safety track record, I think 1,000 is WAY too high, but I'm using it to make a point. Now "only" 15,250 NET lives would be saved, and "only" $75B in NET economic cost avoided.
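    The back-of-envelope arithmetic above, written out (all inputs are the post's own figures and assumptions):

        alcohol_cost_2010 = 49.8e9        # CDC/NHTSA figure quoted above
        alcohol_share = 0.31              # alcohol's share of traffic deaths
        total_cost = alcohol_cost_2010 / alcohol_share   # ~ $160B for all crashes

        total_deaths = 32_500             # approx. US traffic deaths per year
        sdc_share = 0.50                  # assume half the fleet is self-driving
        sdc_caused_deaths = 1_000         # deliberately pessimistic assumption

        lives_saved_gross = total_deaths * sdc_share             # 16,250
        lives_saved_net = lives_saved_gross - sdc_caused_deaths  # 15,250
        cost_saved_net = total_cost * sdc_share * (lives_saved_net / lives_saved_gross)

        print(int(lives_saved_net), round(cost_saved_net / 1e9))  # 15250 75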

    I think, even in this highly exaggerated case, it is worth the loss of 1,000 people to save 16,250 others.

    IMHO, the MEDIA will over-publicize EVERY case of an accidental death involving a self-driving vehicle. Here is why:

    1. Attracts higher ratings and more eyeballs from the general population.

    2. Just as we've seen with BEV vs. ICE, there are MANY entrenched interests that will LOSE financially when self-driving vehicles disrupt the road transportation industry:

    • Lawyers
    • Doctors
    • Hospitals
    • Auto Insurance companies
    • EMS Companies
    All of these (with the possible exception of EMS) have very powerful lobbyists in Washington. Most of our congresspeople ARE LAWYERS!!! IMHO, we need more non-attorneys in Congress.

    3. These entrenched financial interests will want the adoption rate of inexpensive self-driving cars to be as low as possible so they can continue to benefit financially from the loss of approx. 32,500 lives -- just in the USA -- each year due to automobile accidents.

    The economic cost above is just the tangible losses that result from motor vehicle crashes. "When quality of life valuations are considered, the total value of societal harm from motor vehicle crashes in the US in 2010 was an estimated $870.8 billion, of which $206.9 billion resulted from alcohol-impaired-driving crashes."

    4. The alcohol industry will be a beneficiary of self-driving car adoption since there will be no need for a "designated driver".

    5. I predict that once self-driving car market penetration approaches 90-95% in the US, it will become much more difficult to obtain a driver's license. Insurance rates for "self-drivers" will also skyrocket.

    6. I expect someday I'll be telling my grandchildren:

    "You know, cars used to have a wheel and two pedals. One was called the accelerator to make the car go faster. The other one was the brake. I had to sit in the front seat and turn the wheel to make the car go where I wanted it to. It was actually a lot of fun!" :biggrin:
     
  13. Electricfan

    Electricfan Member

    Joined:
    Aug 24, 2013
    Messages:
    971
    Location:
    Houston
    Sorry, what do you mean by "predictable"? In the child vs. mother example, for instance?
     
  14. Johan

    Johan Took a TSLA bear test. Came back negative.

    Joined:
    Feb 9, 2012
    Messages:
    6,890
    Location:
    Drammen, Norway
    #14 Johan, Oct 23, 2015
    Last edited: Oct 23, 2015
    Not sure. I think it will turn out to be a sort of conformity, perhaps mimicking typical human behavior (swerving rather than keeping straight).

    When there's no obvious right or wrong, just two choices, it would perhaps at least be good to have predictability. It's often better and more comforting than seemingly random behavior.
     
  15. SabrToothSqrl

    SabrToothSqrl Active Member

    Joined:
    Dec 5, 2014
    Messages:
    1,525
    Location:
    PA

    OMG that's funny.
     
  16. DougJohnson

    DougJohnson Member

    Joined:
    Apr 2, 2015
    Messages:
    16
    Location:
    Dallas, Texas
    Of course the car should not be in that situation. Neither should human drivers. But exactly what sensor set is going to detect 10 stupid people crossing a highway behind a blind hill? -- Doug
     
  17. musicious

    musicious Member

    Joined:
    Apr 11, 2015
    Messages:
    223
    Location:
    Wheeling, IL
    We wouldn't die from crashing into a wall because of the large crumple zone on the Tesla! So this discussion is a moot point: let the car hit the wall, and insurance can buy the newest model for us :)
     
  18. Max*

    Max* Autopilot != Autonomous

    Joined:
    Apr 8, 2015
    Messages:
    4,866
    Location:
    NoVa
    x-ray?
     
  19. Johan

    Johan Took a TSLA bear test. Came back negative.

    Joined:
    Feb 9, 2012
    Messages:
    6,890
    Location:
    Drammen, Norway
    With the sensor where, exactly? (Clue: high-energy EM radiation goes through objects / doesn't bounce, so a sensor beyond the object would be required.)
     
  20. Max*

    Max* Autopilot != Autonomous

    Joined:
    Apr 8, 2015
    Messages:
    4,866
    Location:
    NoVa
    I wasn't being serious; I don't have an answer to that question. Do I think it can be done? Yes. How? I don't know, but I think there's a clever solution out there somewhere.
     
