
AV makers should be held responsible for accidents, says UK Law Commission

Discussion in 'Autopilot & Autonomous/FSD' started by voyager, Dec 29, 2020.

  1. voyager

    voyager Member

    Joined:
    Apr 28, 2009
    Messages:
    919
    Location:
    Amsterdam, Netherlands
    • Informative x 3
  2. diplomat33

    diplomat33 Well-Known Member

    Joined:
    Aug 3, 2017
    Messages:
    6,832
    Location:
    Terre Haute, IN USA
    That makes sense. It is really just codifying into law what is already implied in the SAE definitions. The SAE says that automated driving systems that are classified as autonomous (L3, L4 or L5) can perform all dynamic driving tasks when they are activated within their respective operational design domains. So if the vehicle is performing all dynamic driving tasks then it follows logically that the vehicle is driving and therefore it should be responsible, not the human. After all, why would the human be responsible when they are not doing any of the driving?
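    The reasoning above can be sketched as a simple rule (a hypothetical illustration only; the level numbers follow SAE J3016, and the function and its names are made up for this sketch):

```python
# Hypothetical sketch of the liability logic described above.
# SAE J3016 levels 0-2 are driver assistance: the human is always driving.
# Levels 3-5 are automated driving: the system performs the full dynamic
# driving task while engaged within its operational design domain (ODD).

def liable_party(sae_level: int, system_engaged: bool, in_odd: bool) -> str:
    """Return who is responsible for the driving task (illustrative only)."""
    if sae_level >= 3 and system_engaged and in_odd:
        return "manufacturer"  # the system is the driver
    return "human"             # driver assistance, or system not engaged

assert liable_party(4, system_engaged=True, in_odd=True) == "manufacturer"
assert liable_party(2, system_engaged=True, in_odd=True) == "human"
```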
     
    • Informative x 2
  3. run-the-joules

    run-the-joules Active Member

    Joined:
    Aug 13, 2017
    Messages:
    3,528
    Location:
    SF Bay
    Because corporate lobbyists bribe politicians to make the laws shy away from holding them accountable for anything, ever.
     
    • Like x 1
  4. Battpower

    Battpower Supporting Member

    Joined:
    Oct 10, 2019
    Messages:
    1,950
    Location:
    Uk
    This pretty much has to be the case imo. Since I doubt traditional insurance carriers will entertain such risks (at least until there are proven risk profiles), it will require autonomous vehicle manufacturers to provide insurance, at least for driving in autonomous mode.

    Also the current 'unsolicited OTA update' model isn't going to satisfy insurance or regulatory bodies so that will need close scrutiny too.
     
    • Informative x 1
  5. EVNow

    EVNow Well-Known Member

    Joined:
    Sep 5, 2009
    Messages:
    9,240
    Location:
    Seattle, WA
    Yes, that's the SOP.
     
  6. voyager

    voyager Member

    Joined:
    Apr 28, 2009
    Messages:
    919
    Location:
    Amsterdam, Netherlands
    Wouldn't it be easier to have 'electronic markers' switch off ALL automatic driving systems once you pass a city's outer limits inbound, and notify you that you may switch on ADS once you're out on the open road?
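    The 'electronic marker' idea above amounts to a geofence check: the car compares its position against a city boundary and enables or disables the automated driving system (ADS) accordingly. A minimal sketch, with the boundary simplified to a circle around a centre point (a real deployment would use polygon boundaries and roadside beacons; the coordinates and radius below are illustrative):

```python
# Hypothetical sketch: disable ADS inside a city geofence, allow it outside.
import math

def ads_allowed(lat: float, lon: float, city_centre: tuple, radius_km: float) -> bool:
    """Return True when the car is outside the city geofence (illustrative)."""
    # Equirectangular approximation: adequate for distances of a few km.
    clat, clon = city_centre
    x = math.radians(lon - clon) * math.cos(math.radians((lat + clat) / 2))
    y = math.radians(lat - clat)
    dist_km = 6371 * math.hypot(x, y)  # 6371 km = mean Earth radius
    return dist_km > radius_km

# Amsterdam centre with a ~10 km geofence (illustrative values)
AMS = (52.37, 4.90)
assert ads_allowed(52.37, 4.90, AMS, 10) is False  # inside the city limits
assert ads_allowed(53.20, 5.80, AMS, 10) is True   # out on the open road
```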
     
  7. drtimhill

    drtimhill Active Member

    Joined:
    Apr 25, 2019
    Messages:
    1,533
    Location:
    Seattle
    So we have a brave new world where...

    -- AP systems become safer than human drivers (even L3 will probably get there fast)
    -- ... BUT automakers are held legally liable for accidents when AP is engaged
    -- ... AND so they (the automakers) disable the AP systems because they cannot afford the liability
    -- ... AND so more people die in driving accidents with cars driven by bad human drivers

    Wonderful.
     
    • Like x 3
  8. voyager

    voyager Member

    Joined:
    Apr 28, 2009
    Messages:
    919
    Location:
    Amsterdam, Netherlands
    There may come a time when we say: what a clever and verifiable elementary system we had: before people are allowed to drive a car, they need to pass an exam; that makes them primarily responsible and liable when accidents do occur, unless another driver is clearly at fault. That is actually "a Brave Old World". If reducing accidents is the main goal, then start by eliminating the DUI ones (if the car smells alcohol/drug use, it disables itself... or something).
     
  9. Battpower

    Battpower Supporting Member

    Joined:
    Oct 10, 2019
    Messages:
    1,950
    Location:
    Uk
    I have found previous discussions around these issues somewhat contradictory. My interpretation of the discussions would be something like:

    I just want a car that can drive itself. I want to be able to set it to go faster than the posted speed limit 'because everyone drives fast and it's dangerous to make others have to pass you'.

    The legal position must be that the speed limit is law, so you must abide by it.

    So then the argument drifts off into countries having different approaches to speed limits, but some people start to give up already and say 'well if the auto driving only goes at the speed limit, I'll never use it'. Others reason that 'because autonomous cars are safer, it will be fine to raise speed limits' - presumably, following this logic we allow higher limits until we are back at an arbitrary level of deaths / injuries that we can live with.

    That seems like a very human determination, unless we reach the point where we are fine abdicating responsibility for such decisions as to 'how much death is it worth so I can drive at whatever speed I want?'

    That question is very problematic because ultimately the objective becomes for the machine to make 'whatever I want' possible.

    We are SO far away from that happening that it isn't worth thinking about. For as long as a human has an input into the design, operation or maintenance of a machine, then ultimately there is a point of responsibility and a risk. Life itself carries risk. Insurance is just a business built around risk, which nowadays has to be a measurable, statistically predictable risk for mass markets. It is also established that there are limits to the insurance carriers obligations, often dictated by points at which blame can be placed elsewhere.

    We already have safe, predictable transport (in theory at least) with public transport, but it is usually inconvenient to a greater or lesser extent. This debate seems to come down to bringing personal mobility closer to being public transport.

    Trouble is, many of us appear to be not so good at sharing or accepting compromise. It's funny that in considering how a machine can completely take over from a human, it ends up being about 'having whatever you want without compromise' which isn't how nature works!

    I actually enjoy taking responsibility for things I have control over. That is where I feel I have some freedom. I want my car to have driver controls and retain the right to use them how I deem safe.
     
    • Informative x 1
  10. banned-66611

    banned-66611 Guest

    This makes a lot of sense. It sounds like they are getting ahead of lawsuits here, because when this kind of accident does happen you can be sure that the driver will want to hold the manufacturer responsible for the costs and blame them to avoid prosecution as well.

    Of course it needs to be backed up by some reasonable rules, e.g. if the car says it is level 3 and you must take over, then you need a reasonably long time to do that, say 30 seconds, not 3 seconds; if you don't, it's your fault.
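    That takeover rule can be sketched as a threshold check (a hypothetical illustration; the 30-second figure comes from the post above, and the function names are invented for this sketch):

```python
# Hypothetical sketch of the takeover-window rule suggested above: after an
# L3 system requests a takeover, fault shifts to the driver only if they were
# given a reasonably long window (e.g. 30 s, not 3 s) and failed to respond.

MIN_TAKEOVER_WINDOW_S = 30  # illustrative threshold from the discussion

def fault_after_takeover_request(window_s: float, driver_took_over: bool) -> str:
    """Assign fault after a takeover request (illustrative only)."""
    if driver_took_over:
        return "n/a"           # driver resumed control in time
    if window_s >= MIN_TAKEOVER_WINDOW_S:
        return "driver"        # ample warning was given and ignored
    return "manufacturer"      # warning was too short to act on

assert fault_after_takeover_request(30, driver_took_over=False) == "driver"
assert fault_after_takeover_request(3, driver_took_over=False) == "manufacturer"
```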
     
  11. banned-66611

    banned-66611 Guest

    If you want people to use those safer systems then yes, you need to hold the manufacturer liable.

    Imagine you had an L3 system but if it killed someone you would go to jail. Would you trust it? Would you take your eyes off the road as it allows you to?

    If the driver is held liable then nobody would use them anyway. The manufacturer taking responsibility is the only solution.
     
  12. banned-66611

    banned-66611 Guest

    That kind of thing has been suggested before; the problem is creating a viable sensor that can reliably tell whether someone is actually above the legal limit.

    There is a reason why those breath testers are only used as a first step, an indicator that needs to be confirmed with a blood test. They just are not that reliable or accurate, and it would be a problem if your car wouldn't start because you took some medication that caused a misdiagnosis.

    Also it would be problematic from a practical point of view. The breath testers have disposable parts for you to put your lips around; they are used once and discarded, which creates a lot of waste plastic. Cleaning might be an option, but especially these days with COVID...

    And after all that it wouldn't detect things like drug use or tiredness.
     
  13. banned-66611

    banned-66611 Guest

    In practice I think people would use it. They are impatient because all they can do is drive and are just waiting to get to the destination.

    If they can use their phone or watch YouTube or something they won't care so much about an extra minute or two. Just sit back and relax.
     
  14. Battpower

    Battpower Supporting Member

    Joined:
    Oct 10, 2019
    Messages:
    1,950
    Location:
    Uk
    I should perhaps add that I do have FSD on my car, but not in anticipation that the driver controls will be removed anytime soon!

    In fact, while I am absolutely not a Luddite, I would regard it as a very dark day if / when we collectively decide that my mobility is determined by corporations and self driving cars.
     
  15. voyager

    voyager Member

    Joined:
    Apr 28, 2009
    Messages:
    919
    Location:
    Amsterdam, Netherlands
    There has always been a lot of attention paid to the ease and comfort of having the car drive itself... But what about the fact that people really enjoy driving the car themselves? It is sometimes called the last place where a person can have some privacy and full control.

    For now, authorities feel pressured to come up with regulations that don't smother new developments at the same time...

     
    • Like x 1
  16. Battpower

    Battpower Supporting Member

    Joined:
    Oct 10, 2019
    Messages:
    1,950
    Location:
    Uk
    #16 Battpower, Dec 30, 2020
    Last edited: Dec 30, 2020
    In fact, the term 'self driving' is perfect as it embodies all points of view.

    Just depends on if the subject is the driver or the car.

    I think the biggest hurdle will be managing coexistence of human and machine drivers. A different tack that one day might gain traction would be purely functional personal transport devices that are cheap and simple and use extreme geofencing to enable simplified design. Something not much more elaborate than a Renault Twizy but with an ability to hook into a longer range network. When your 'pod' hooks into the closely controlled long range network, you cease to have control or liability.
     
    • Like x 1
  17. diplomat33

    diplomat33 Well-Known Member

    Joined:
    Aug 3, 2017
    Messages:
    6,832
    Location:
    Terre Haute, IN USA
    #17 diplomat33, Dec 30, 2020
    Last edited: Dec 30, 2020
    Automakers would only be held liable if the AP system were an autonomous system. If the AP system were a driver assist, the human would continue to be liable for accidents. Also, I assume that this law would only hold AV companies liable for at-fault accidents, not all accidents. So if the AV did not cause the accident, the AV company would not be liable.

    Now I admit that this law could cause some companies to just claim their autonomous driving is really a "driver assist" so as to avoid liability. But if regulators are given the power to determine what is autonomous and what is a driver assist, then that would solve that problem.

    Really, what the law is designed to do is prevent AV companies from putting out a bad product. If an AV company tried to pass off a driver assist system as autonomous or tried to put a bad autonomous system on the road then yeah, they will be in trouble because the car will get into a lot of at-fault accidents and the company will get sued out of business.

    But if the AV company has a true autonomous driving system that works well and is a lot safer than humans, then they have nothing to worry about. The AV will rarely get into any at-fault accidents and the company will be able to afford the liability.

    If a human is a bad driver that causes accidents, they will likely pay higher insurance premiums and maybe even lose their driver's license. Why should it be any different for AVs? If an AV is a bad driver then it should cost the AV company more and if it is really serious, the AV should be removed from public roads. It makes no sense to punish the human if the AV is a bad driver since the human is not driving.
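    The argument above reduces to two questions: was the AV at fault, and does the regulator classify the system as autonomous or as a driver assist? A hypothetical sketch (the function and its names are invented for illustration, not part of any actual law):

```python
# Hypothetical sketch combining the two tests described above: fault first,
# then the regulator's classification of the system.

def liable_for_accident(regulator_says_autonomous: bool, av_at_fault: bool) -> str:
    """Assign liability for an accident involving an AV (illustrative only)."""
    if not av_at_fault:
        return "other party"   # the AV did not cause the accident
    if regulator_says_autonomous:
        return "AV company"    # the vehicle was the driver
    return "human driver"      # driver assist: the human remains liable

assert liable_for_accident(True, True) == "AV company"
assert liable_for_accident(False, True) == "human driver"
assert liable_for_accident(True, False) == "other party"
```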
     
    • Like x 3
  18. banned-66611

    banned-66611 Guest

    The manufacturer can be held liable if the driver assist system is unreasonably dangerous. Just putting "you are responsible" on the box doesn't absolve the manufacturer of all responsibility for bad things that happen as a result of poor design or defects.

    At the moment the family of a man killed by someone using Autopilot is suing Tesla on this very point, their claim being based on the idea that the "hands on wheel" detection is completely inadequate and does nothing to verify that the driver is actually paying attention, while lulling them into a false sense of security.
     
    • Like x 1
  19. voyager

    voyager Member

    Joined:
    Apr 28, 2009
    Messages:
    919
    Location:
    Amsterdam, Netherlands
    Excellent comments, gentlemen! "To drive or to be driven" is what Shakespeare already said. ;)

    I looked up what the IIHS (Insurance Institute for Highway Safety) has commented lately. This one is interesting:
    "The Institute’s analysis suggests that only about a third of those crashes were the result of mistakes that automated vehicles would be expected to avoid simply because they have more accurate perception than human drivers and aren’t vulnerable to incapacitation. To avoid the other two-thirds, they would need to be specifically programmed to prioritize safety over speed and convenience".

    https://www.iihs.org/news/detail/self-driving-vehicles-could-struggle-to-eliminate-most-crashes
     
    • Helpful x 1
  20. Battpower

    Battpower Supporting Member

    Joined:
    Oct 10, 2019
    Messages:
    1,950
    Location:
    Uk
    #20 Battpower, Dec 30, 2020
    Last edited: Dec 30, 2020
    Especially given the recent UI issues (edit: does regulation have an obligation to regulate the UI and deem it safe, unambiguous and functional? When is it driver error, driver confusion, driver distraction or a UI issue? Fault is already hard enough to establish without the vehicle autonomy aspect), here is a good point to consider. Even more so as FSD owners clamour to get their hands on the latest software capabilities. Even now we see flagrant abuse of self driving that obviously contravenes Tesla's instructions and likely puts the errant driver in the wrong. But the debate still exists as to where the line is.

    Even with current general release FSD, it has city-based functions while arguably not being intended for city use. And that doesn't attempt to deal with 'what's a city?' Does a main freeway type road passing through a city constitute 'city use'? More importantly, can you expect J. Doe to know for sure if the car is in what the manufacturer currently regards as a 'city'?

    In the case of Tesla, I believe that since AP availability turns on and off outside of my control, when the car makes it available then it should be fine for me to use, and I should be able to expect a clearly defined level of safety, regardless of beta, pre-release or whatever.

    Once real L5 is possible, the debate somewhat disappears, but until then it is a hugely complex area.
     
