If 90% crash reduction goal achieved - will laws require self driving?

This is how I think it will go down, as opposed to a government mandate. Once the technology is good enough that it is clear autonomous driving decreases the chance of a collision, injury, etc., and thus decreases the average cost of the claims insurers pay, the cost of insurance should decrease. Specifically, you would pay a lower premium if you have an autonomous vehicle, just like you pay a lower premium if you have ABS, airbags, or an accelerometer (Drive Safe & Save™ Mobile App – State Farm®).

Sure, you can still drive your own car if you want your insurance rates to be double. Auto insurance is ridiculously expensive due to the frequency of collisions caused by human error. A 90% reduction in accidents could substantially lower insurance rates, making autonomous driving technology really pay for itself quickly.
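
To put rough numbers on that, here is a back-of-the-envelope sketch of how an expected-loss premium scales with claim frequency. The function name and every input below (claim frequency, average claim cost, expense load) are made-up illustrative assumptions, not real actuarial figures.

```python
# Toy pure-premium calculation: the premium tracks expected losses,
# grossed up for the insurer's overhead and profit. Illustration only.

def annual_premium(claims_per_car_year, avg_claim_cost, expense_load=0.25):
    """Expected annual losses divided by (1 - expense load)."""
    expected_loss = claims_per_car_year * avg_claim_cost
    return expected_loss / (1.0 - expense_load)

human_driven = annual_premium(claims_per_car_year=0.05, avg_claim_cost=15_000)
autonomous = annual_premium(claims_per_car_year=0.05 * 0.10, avg_claim_cost=15_000)

print(f"human-driven: ${human_driven:,.0f}/yr")  # $1,000/yr
print(f"autonomous:   ${autonomous:,.0f}/yr")    # $100/yr after a 90% drop in claims
```

If claim frequency really fell 90% and everything else stayed constant, the premium would fall by roughly the same 90%.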
I absolutely agree that this will be the stop-gap measure.

You drive your car yourself? That'll be $5,000 for 6 months. You have an autonomous car? That'll be $100 for 6 months.
 
i have a hard time believing laws forbidding human intervention would be popular enough
Hit them where it hurts - money.

You want to insure a non-self driving car? $$$$$$$
You don't want to carry insurance at all? $$$$$$$$$$$$$$$$$$$$$$$$$$

You've just passed a law saying autonomous cars must be used in autonomous mode, without ever stating those words.
 
Might it be that insurance companies have a lot more say about it than the government? And besides, I remember the debacle concerning the flame retardant for children's clothing (sorry, the name's just not coming to me). A particular flame retardant was mandated for use in all children's clothing - until it was discovered to be an extremely potent carcinogen. Then the mandate was, no surprise, that no clothes could contain it! The problem with government involvement is that it tends to be based on reelection possibilities.

Very incisive! Once it is established that limited "auto-pilot" features actually reduce accidents, insurance companies will adjust their rates accordingly. When there are cars capable of intervening and taking control away from the driver to prevent accidents, and it is established that these also reduce accidents, insurance companies will adjust their rates accordingly. And once there are fully-autonomous cars being sold to the public, insurance rates will again be adjusted. At each stage, you can choose to drive a safer car, or you can choose to pay higher rates than the computerized cars.

I will never set foot in a vehicle that does not allow emergency intervention 100% of the time.

I, too, would refuse to ride in a car that is not capable of taking control away from the driver in an emergency, once such cars become available.

Altogether too many people are looking at the whole issue backwards: They think of all the situations when (in their usually-mistaken view) they could prevent an accident better than a computer could. The correct way to view the issue is: How many people die because of human drivers, vs. how many people would die if computers drove our cars. There are roughly 32,000 traffic deaths a year at present. If autonomous or partially-autonomous cars eliminated 30,000 of those deaths and caused 2,000 other deaths, that's 28,000 lives saved.

But few people are capable of correctly assessing risk, and fewer still act on such a correct assessment. People want to feel in control, and altogether too many people feel that 32,000 deaths a year in the U.S. alone is an acceptable price to pay in order to remain in control of their car.

I have the same emotional reaction; I just don't let it govern my choices. I am more frightened to ride as a passenger in a car with a driver who is clearly a better driver than I am, than I am of driving my own car. But I know that I am safer with that person driving. I am scared to fly, but I fly anyway, because I know it's the safest way to get where I want to go.
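
For what it's worth, here is the arithmetic behind that hypothetical split; the 30,000 / 2,000 breakdown is purely illustrative, not a prediction.

```python
# Net effect of the hypothetical shift to autonomous driving described above.
current_deaths = 32_000   # approximate annual U.S. traffic deaths
eliminated = 30_000       # hypothetical human-error deaths avoided
newly_caused = 2_000      # hypothetical new computer-caused deaths

deaths_with_autonomy = current_deaths - eliminated + newly_caused  # 4,000
net_lives_saved = current_deaths - deaths_with_autonomy            # 28,000

print(deaths_with_autonomy, net_lives_saved)  # 4000 28000
```

That residual figure of 4,000 deaths a year is the number the rest of this exchange refers back to.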

I agree that U.S. lawmakers tend to oppose regulations until those regulations have been very clearly demonstrated to save lives, and that they are unlikely to mandate the retirement of older cars - though that would be the logical thing to do, or at least to outlaw the manufacture of new non-autonomous cars once the technology is established. But what government won't do, insurance companies will not hesitate to do: charge according to the actuarial tables.
 
Once cars from Tesla (for example) avoid xx% of fatal accidents, there will be lawsuits against the other car manufacturers for just about every fatality that occurs in their vehicles. Quite possibly it will quickly become prohibitively risky and expensive to sell a car without the safety/autonomous features.
 
My problem with that line of reasoning is that many of those 32,000 deaths were caused by stupid people doing stupid things. I'm not a stupid driver, so my risk of an accident is much lower, and I will always want to feel in control of the vehicle in case the computer fails to recognize a hazard. I do not want to be one of the 2,000 drivers who lost their lives because the computer made an error that I otherwise would not have made.

Overall, risk will likely decrease dramatically - I don't disagree with that. But I do dispute the notion that, just because the risk of a computer-caused accident is so low, we must accept it as a fact of life and worth the risk. I just can't accept that argument.
 
My problem with that line of reasoning is that many of those 32,000 deaths were caused by stupid people doing stupid things. I'm not a stupid driver, so my risk of an accident is much lower, and I will always want to feel in control of the vehicle in case the computer fails to recognize a hazard. I do not want to be one of the 2,000 drivers who lost their lives because the computer made an error that I otherwise would not have made.

chances are you are not special. everyone thinks they're the exception -- 86% of drivers believe they are above average. only 13% believe they are average, and less than 1% believe they are below average or poor at driving.

Driver Report Card: Hankook Tire Reveals How Americans Grade Themselves as Drivers

it is illogical to accept at face value your assertion that you're a better driver than typical drivers, especially given the data that 86% of drivers believe the same thing. but logic has little to do with it. people possessing the same self-evaluation as yours are very much in the majority, which will be the primary obstacle to implementing laws mandating autonomous operation.
 
The problem is this: of those 32,000 people who die each year, what percentage are actually good, undistracted drivers? All it takes is for someone to run an intersection while texting and ram into you at 60 mph, or some teen who thinks he can make the light and swerves in and out of traffic.

The other issue is that you might be the best driver in the world, but if some idiot is doing 50 mph through a residential 25 mph neighborhood while you're crossing the street, your chance of survival is ~25%. If he were instead in an autonomous car going 25 mph, your chance of survival upon impact would be ~85%.

You don't have to be the cause of the accident to end up a statistic.
 
I think we will be required to have automated driving when a cybernetic organism from the future descends in a ball of lightning and says, "Hello you fine folk. I'm from the future, and I can assure you, that we will do just fine with automated vehicles. By the way, do any of you know where John Connor is? He is supposed to help me design the world's safest machinery."
 
The problem is this: of those 32,000 people who die each year, what percentage are actually good, undistracted drivers? All it takes is for someone to run an intersection while texting and ram into you at 60 mph, or some teen who thinks he can make the light and swerves in and out of traffic.

^This is what I'm talking about. People texting when it is unsafe to do so. People driving 80 mph and swerving out of control on a rainy day. Truck drivers that swerve out to overtake another big rig (happens all the time on the 5). These are objectively stupid drivers. I'm not one of them.

chances are you are not special. everyone thinks they're the exception -- 86% of drivers believe they are above average. only 13% believe they are average, and less than 1% believe they are below average or poor at driving.

Driver Report Card: Hankook Tire Reveals How Americans Grade Themselves as Drivers

it is illogical to accept at face value your assertion that you're a better driver than typical drivers, especially given the data that 86% of drivers believe the same thing. but logic has little to do with it. people possessing the same self-evaluation as yours are very much in the majority, which will be the primary obstacle to implementing laws mandating autonomous operation.

And chances are that many of those 86% are in fact good, responsible drivers. I strongly suspect most accidents are caused by the remaining 14%. We all make mistakes, but there is an obvious distinction between recklessness and rare oversight.

The other issue is that you might be the best driver in the world, but if some idiot is doing 50 mph through a residential 25 mph neighborhood while you're crossing the street, your chance of survival is ~25%. If he were instead in an autonomous car going 25 mph, your chance of survival upon impact would be ~85%.

You don't have to be the cause of the accident to end up a statistic.

The car does not need full autonomy. Collision avoidance and AEB are all that is needed (and they are enabled by default in all Teslas). I think this was an excellent decision by Tesla.
 
I think the cost of insurance will go up fast until it becomes unaffordable for most people to drive the car themselves. Insurance companies will have to charge human drivers high premiums because they won't be able to earn much from self-driving cars with low accident rates.
 
I think the cost of insurance will go up fast until it becomes unaffordable for most people to drive the car themselves. Insurance companies will have to charge human drivers high premiums because they won't be able to earn much from self-driving cars with low accident rates.

Maybe Tesla could offer to insure their owners' cars for a reasonable cost that reflects the decreased risk with their new technology, without gouging the consumer. After all, Tesla will have a large part of the exposure when the computer does actually cause a crash, so why not just build that into the price of the car?
 
Once cars from Tesla (for example) avoid xx% of fatal accidents, there will be lawsuits against the other car manufacturers for just about every fatality that occurs in their vehicles. Quite possibly it will quickly become prohibitively risky and expensive to sell a car without the safety/autonomous features.

I think that such lawsuits would be thrown out of court unless and until a significant portion of the fleet has turned over, after it has become economically feasible for all automakers to include the technology. No court is going to say that an automaker is liable for not having autonomous technology in a car built before such technology was widely available. If Tesla develops this technology in five years and makes it freely available to all other automakers, it will still be another decade before the majority of older cars have been retired from use.

A much more likely scenario is that it will become so much cheaper to insure autonomous cars and/or cars with automated crash avoidance, that there will be serious financial pressure to switch away from the older cars.

My problem with that line of reasoning is that many of those 32,000 deaths were caused by stupid people doing stupid things. I'm not a stupid driver, so my risk of an accident is much lower, and I will always want to feel in control of the vehicle in case the computer fails to recognize a hazard. I do not want to be one of the 2,000 drivers who lost their lives because the computer made an error that I otherwise would not have made.

Part of the problem is that virtually everyone thinks they are a flawless driver. But everyone gets distracted from time to time. Everyone makes mistakes. And nobody has the reaction time of a computer. I don't care how good a driver you are, a computer will be better.

Overall, risk will likely decrease dramatically - I don't disagree with that. But I do dispute the notion that, just because the risk of a computer-caused accident is so low, we must accept it as a fact of life and worth the risk. I just can't accept that argument.

It sounds as though you're saying that 32,000 deaths a year is better than 4,000. But I'm sure you don't mean that, so I must be misunderstanding your point. Even the best driver will be safer in an autonomous car, because the car is simultaneously tracking the position and velocity of every car in the vicinity, and can react in less than an eyeblink.

You say you don't want to accept the risk of 4,000 deaths a year, but you are accepting the risk of 32,000. That makes no sense. Nothing is perfect. There will always be traffic deaths as long as we have cars. Computers will make fewer mistakes than even the very best of drivers, therefore they are preferable. And once all cars are autonomous and communicating with each other, the number of deaths is likely to be closer to 100 than 4,000, and many of those will be from strokes and heart attacks unrelated to the car.
 
@daniel - Correct, you did not understand my point. "For the good of the many" has never been an argument that has gone over well with me. I still care about the few. Computers make mistakes. They are not perfect and never will be. All I'm asking for is a manual override. I'm not trying to stop autonomy.
 
@daniel - Correct, you did not understand my point. "For the good of the many" has never been an argument that has gone over well with me. I still care about the few. Computers make mistakes. They are not perfect and never will be. All I'm asking for is a manual override. I'm not trying to stop autonomy.

You care about the few. So 32,000 deaths per year is better than 4,000??? I'm flummoxed here. You want a manual override. Do you mean you want to be able to prevent the car from taking over when it senses an emergency situation? Or do you merely mean you want to do the driving unless and until the car senses an emergency? If you mean the latter, I don't have much problem with that. But if you mean you want to be able to prevent the car from taking over to prevent a crash, then you've defeated the whole purpose.

If you mean that the right of the few to remain completely in control of their car even when the car could prevent an accident is more important than saving 28,000 lives a year, then I'm in complete disagreement. A car is a deadly weapon, and no human driver will be as good as a computer at avoiding/preventing crashes. Once we have technology that results in fewer deaths than good human drivers, then refusing to use such technology is the moral equivalent of attempted murder.

Again, if all you're saying is that you want to be in control until the car recognizes an emergency situation, I've got no big issue. But if you want to be able to completely disable autopilot so that it cannot respond to an emergency, then I hope you're not driving in the same city as I am.
 