Tesla and Volvo disagree on the ethics of self-driving cars
Tesla's view is that the main obstacle for self-driving cars is the rules governing the control systems, the ones that make decisions that can mean the difference between life and death. Volvo disagrees.
Peter Carlsson, chief logistics officer at Tesla, was asked about the company's work on self-driving cars during the Sthlm Tech Fest conference last week.
- Of course we have a team working on it. All new cars will be able to drive without a driver thanks to all the sensors and cameras. But for them to actually do this, many other things are of course required, said Carlsson.
Erik Coelingh, technical specialist at Volvo, disagrees that these life-and-death decisions are the hardest part.
- Our strategy is for the car to never find itself in a situation where it needs to make these kinds of moral decisions. Ever.
- Because of this, the self-driving cars will always drive very defensively. They will always keep a following distance to the car in front that is longer than the expected braking distance.
According to Volvo, the self-driving cars will never exceed 70 km/h.
- If, for example, a trailer has stopped along the side of the road, making it impossible for the self-driving car to see whether there is a potential danger ahead, it should change lanes to get a better view or slow down. It should never take a risk and try to pass in the lane it is already in, says Coelingh.
But there must be situations the self-driving car can't anticipate, and there could be technical problems with the car?
- Yes, but they are few. If for example a skydiver suddenly lands in front of the car. That would be an extremely uncommon event.
- In cases like these the car should follow the traffic rules, stay in its lane and brake as hard as possible. It should not attempt to make moral decisions. The same goes for a situation with technical problems that couldn't have been anticipated.
- There will never be a table deciding whether the self-driving car should choose to hit a Fiat or another Volvo.
Nor whether the car should choose to hit a child or a tractor?
- No, the car should do everything in its power to avoid those kinds of situations. If something extremely uncommon occurs it should still follow the traffic rules, stay in its lane and brake as hard as it can.
Erik Coelingh says that the important moral issue is whether society should keep accepting the 30 000 yearly traffic deaths in Europe, approximately the same number as in the US, when there is technology - self-driving cars - that can radically reduce those numbers.
He compares the issue to vaccinations, which prevent a great deal of human suffering despite causing a few cases of illness.
- The same moral issue is present with self-driving cars. A few new types of deaths could occur as a result of this technology, while at the same time it prevents a much larger number of deadly accidents.
Coelingh does not want to estimate how many of today's 30 000 yearly traffic deaths self-driving cars could prevent.
- We just don't know. But we do know that almost all accidents are caused by human behaviour. So on paper the potential to reduce the number of accidents is big, but we need to confirm this in reality.
Volvo's trials with "Drive Me" self-driving cars in Gothenburg start in 2017. Some time during 2018 or 2019 Coelingh believes he will have good real-life data. He has high expectations:
- If we were only able to reduce the number of yearly traffic deaths from 30 000 to 3 000 by introducing self-driving cars, I'd be disappointed.