AV makers should be held responsible for accidents, says UK Law Commission

"Under a new proposal from the UK‘s Law Commission the driver of the vehicle would NOT be prosecuted when the car was in self-driving mode(...) Instead responsibility should fall on the developer or manufacturer of the hardware that enables self-driving."

Autonomous vehicle makers should be held responsible for accidents, says Law Commission

That makes sense. It is really just codifying into law what is already implied in the SAE definitions. The SAE says that automated driving systems that are classified as autonomous (L3, L4 or L5) can perform all dynamic driving tasks when they are activated within their respective operational design domains. So if the vehicle is performing all dynamic driving tasks then it follows logically that the vehicle is driving and therefore it should be responsible, not the human. After all, why would the human be responsible when they are not doing any of the driving?
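
To make that framing concrete, here is a minimal Python sketch (the enum, the function name and the return labels are all hypothetical; it just encodes the SAE logic described above):

```python
from enum import IntEnum

class SAELevel(IntEnum):
    # SAE J3016 levels of driving automation
    L0 = 0  # no automation
    L1 = 1  # driver assistance
    L2 = 2  # partial automation - the human is still the driver
    L3 = 3  # conditional automation
    L4 = 4  # high automation
    L5 = 5  # full automation

def responsible_party(level: SAELevel, engaged: bool, in_odd: bool) -> str:
    """Who is performing the dynamic driving task?

    At L3+ with the system engaged inside its operational design
    domain (ODD), the vehicle is doing the driving, so responsibility
    falls on its developer/manufacturer; otherwise the human drives.
    """
    if engaged and in_odd and level >= SAELevel.L3:
        return "manufacturer"
    return "human driver"

assert responsible_party(SAELevel.L2, engaged=True, in_odd=True) == "human driver"
assert responsible_party(SAELevel.L4, engaged=True, in_odd=True) == "manufacturer"
```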
 
"Under a new proposal from the UK‘s Law Commission the driver of the vehicle would NOT be prosecuted when the car was in self-driving mode(...) Instead responsibility should fall on the developer or manufacturer of the hardware that enables self-driving."

Autonomous vehicle makers should be held responsible for accidents, says Law Commission

This pretty much has to be the case, imo. Since I doubt traditional insurance carriers will entertain such risks (at least until there are proven risk profiles), autonomous vehicle manufacturers will effectively be required to provide insurance, at least for driving in autonomous mode.

Also, the current 'unsolicited OTA update' model isn't going to satisfy insurance or regulatory bodies, so that will need close scrutiny too.
 
So we have a brave new world where...

-- AP systems become safer than human drivers (even L3 will probably get there fast)
-- ... BUT automakers are held legally liable for accidents when AP is engaged
-- ... AND so they (the automakers) disable the AP systems because they cannot afford the liability
-- ... AND so more people die in driving accidents with cars driven by bad human drivers

Wonderful.
 
So we have a brave new world where...

-- AP systems become safer than human drivers (even L3 will probably get there fast)
-- ... BUT automakers are held legally liable for accidents when AP is engaged
-- ... AND so they (the automakers) disable the AP systems because they cannot afford the liability
-- ... AND so more people die in driving accidents with cars driven by bad human drivers

Wonderful.

There may come a time when we look back and say: what a clever, verifiable and elementary system we had. Before people are allowed to drive a car, they must pass an exam; that makes them primarily responsible and liable when accidents do occur, unless another driver is clearly at fault. That is actually "a Brave Old World". If reducing accidents is the main goal, then start by eliminating the DUI ones (if the car smells alcohol or drug use, it disables itself... or something).
 
I have found previous discussions around these issues somewhat contradictory. My reading of them goes something like this:

I just want a car that can drive itself. I want to be able to set it to go faster than the posted speed limit 'because everyone drives fast and it's dangerous to make others have to pass you'.

The legal position must be that the speed limit is law, so you must abide by it.

So then the argument drifts off into countries having different approaches to speed limits, but some people give up at that point and say 'well, if the auto driving only goes at the speed limit, I'll never use it'. Others reason that 'because autonomous cars are safer, it will be fine to raise speed limits' - presumably, following this logic, we allow higher limits until we are back at an arbitrary level of deaths and injuries that we can live with.

That seems like a very human determination, unless we reach the point where we are fine abdicating responsibility for decisions like 'how much death is it worth so I can drive at whatever speed I want?'

That question is very problematic because ultimately the objective becomes for the machine to make 'whatever I want' possible.

We are SO far away from that happening that it isn't worth thinking about. For as long as a human has input into the design, operation or maintenance of a machine, there is ultimately a point of responsibility and a risk. Life itself carries risk. Insurance is just a business built around risk, which nowadays has to be a measurable, statistically predictable risk for mass markets. It is also established that there are limits to the insurance carriers' obligations, often dictated by points at which blame can be placed elsewhere.

We already have safe, predictable transport (in theory at least) with public transport, but it is usually inconvenient to a greater or lesser extent. This debate seems to come down to bringing personal mobility closer to being public transport.

Trouble is, many of us appear to be not so good at sharing or accepting compromise. It's funny that in considering how a machine can completely take over from a human, it ends up being about 'having whatever you want without compromise', which isn't how nature works!

I actually enjoy taking responsibility for things I have control over. That is where I feel I have some freedom. I want my car to have driver controls and retain the right to use them how I deem safe.
 
This makes a lot of sense. It sounds like they are getting ahead of lawsuits here, because when this kind of accident does happen, you can be sure the driver will want to hold the manufacturer responsible for the costs, and to blame them to avoid prosecution as well.

Of course it needs to be backed up by some reasonable rules. For example, if the car says it is Level 3 and you must take over, you need a reasonably long time to do that - say 30 seconds, not 3 - and only if you still don't take over is it your fault.
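
As a toy illustration of that rule, a sketch in Python (the 30-second window and the labels are just the assumption from the post above, not anything a regulator has specified):

```python
TAKEOVER_GRACE_S = 30.0  # the "reasonably long" window suggested above

def liable_party(elapsed_s: float, driver_resumed: bool) -> str:
    """Toy model of an L3 handover rule.

    After the car requests a takeover, the manufacturer stays
    responsible until a generous grace window has passed; only a
    driver who ignores the whole window takes the blame.
    """
    if driver_resumed:
        return "driver"
    return "manufacturer" if elapsed_s < TAKEOVER_GRACE_S else "driver"

# A 3-second warning would leave the manufacturer on the hook:
assert liable_party(elapsed_s=3.0, driver_resumed=False) == "manufacturer"
assert liable_party(elapsed_s=45.0, driver_resumed=False) == "driver"
```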
 
So we have a brave new world where...

-- AP systems become safer than human drivers (even L3 will probably get there fast)
-- ... BUT automakers are held legally liable for accidents when AP is engaged
-- ... AND so they (the automakers) disable the AP systems because they cannot afford the liability
-- ... AND so more people die in driving accidents with cars driven by bad human drivers

Wonderful.

If you want people to use those safer systems then yes, you need to hold the manufacturer liable.

Imagine you had an L3 system but if it killed someone you would go to jail. Would you trust it? Would you take your eyes off the road as it allows you to?

If the driver were held liable then nobody would use them anyway. The manufacturer taking responsibility is the only solution.
 
There may come a time when we look back and say: what a clever, verifiable and elementary system we had. Before people are allowed to drive a car, they must pass an exam; that makes them primarily responsible and liable when accidents do occur, unless another driver is clearly at fault. That is actually "a Brave Old World". If reducing accidents is the main goal, then start by eliminating the DUI ones (if the car smells alcohol or drug use, it disables itself... or something).

That kind of thing has been suggested before; the problem is creating a viable sensor that can reliably tell whether someone is actually above the legal limit.

There is a reason why those breath testers are only used as the first step, an indicator that needs to be confirmed with a blood test. They just are not that reliable or accurate, and it would be a problem if your car wouldn't start because you took some medication that caused a misdiagnosis.

It would also be problematic from a practical point of view. The breath testers have disposable mouthpieces that you put your lips around; they are used once and discarded, which creates a lot of waste plastic. Cleaning might be an option, but especially these days with COVID...

And after all that it wouldn't detect things like drug use or tiredness.
 
So then the argument drifts off into countries having different approaches to speed limits, but some people give up at that point and say 'well, if the auto driving only goes at the speed limit, I'll never use it'. Others reason that 'because autonomous cars are safer, it will be fine to raise speed limits' - presumably, following this logic, we allow higher limits until we are back at an arbitrary level of deaths and injuries that we can live with.

In practice I think people would use it. They are impatient because all they can do is drive, and they are just waiting to get to the destination.

If they can use their phone or watch YouTube or something they won't care so much about an extra minute or two. Just sit back and relax.
 
I should perhaps add that I do have FSD on my car, but not in anticipation that the driver controls will be removed anytime soon!

In fact, while I am absolutely not a Luddite, I would regard it as a very dark day if/when we collectively decide that my mobility is determined by corporations and self-driving cars.
 
I should perhaps add that I do have FSD on my car, but not in anticipation that the driver controls will be removed anytime soon!

In fact, while I am absolutely not a Luddite, I would regard it as a very dark day if/when we collectively decide that my mobility is determined by corporations and self-driving cars.

There has always been a lot of attention on the ease and comfort of having the car drive itself... But what about the fact that many people really enjoy driving the car themselves? It is sometimes called the last place where a person can have some privacy and full control.

For now, authorities feel pressured to come up with regulations that don't smother new developments at the same time...

 
There has always been a lot of attention on the ease and comfort of having the car drive itself... But what about the fact that many people really enjoy driving the car themselves? It is sometimes called the last place where a person can have some privacy and full control.

For now, authorities feel pressured to come up with regulations that don't smother new developments at the same time...


In fact, the term 'self driving' is perfect as it embodies all points of view.

It just depends on whether the subject is the driver or the car.

I think the biggest hurdle will be managing the coexistence of human and machine drivers. A different tack that might one day gain traction would be purely functional personal transport devices that are cheap and simple and use extreme geofencing to enable a simplified design. Something not much more elaborate than a Renault Twizy, but with the ability to hook into a longer-range network. When your 'pod' hooks into the closely controlled long-range network, you cease to have control or liability.
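
The geofencing part, at least, is easy to sketch. A minimal point-in-polygon test in Python (the zone coordinates are made up; a real system would use geodetic coordinates and far more careful boundary handling):

```python
def inside_geofence(point, polygon):
    """Ray-casting point-in-polygon test.

    point: (x, y); polygon: list of (x, y) vertices in order.
    A 'pod' like the one imagined above might only allow manual
    control while the vehicle is inside the local zone.
    """
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge crosses the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

local_zone = [(0, 0), (4, 0), (4, 3), (0, 3)]  # hypothetical zone
assert inside_geofence((2, 1), local_zone)      # manual control OK
assert not inside_geofence((5, 1), local_zone)  # network takes over
```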
 
So we have a brave new world where...

-- AP systems become safer than human drivers (even L3 will probably get there fast)
-- ... BUT automakers are held legally liable for accidents when AP is engaged
-- ... AND so they (the automakers) disable the AP systems because they cannot afford the liability
-- ... AND so more people die in driving accidents with cars driven by bad human drivers

Wonderful.

Automakers would only be held liable if the AP system were an autonomous system. If the AP system were a driver assist, the human would continue to be liable for accidents. Also, I assume this law would only hold AV companies liable for at-fault accidents, not all accidents. So if the AV did not cause the accident, the AV company would not be liable.

Now I admit that this law could cause some companies to just claim their autonomous driving is really a "driver assist" so as to avoid liability. But if regulators are given the power to determine what is autonomous and what is a driver assist, then that would solve that problem.

Really, what the law is designed to do is prevent AV companies from putting out a bad product. If an AV company tried to pass off a driver assist system as autonomous, or tried to put a bad autonomous system on the road, then yeah, they would be in trouble, because the car would get into a lot of at-fault accidents and the company would get sued out of business.

But if the AV company has a true autonomous driving system that works great and is a lot safer than humans, then they have nothing to worry about. The AV will rarely get into any at-fault accidents, and the company will be able to afford the liability.

If a human is a bad driver that causes accidents, they will likely pay higher insurance premiums and maybe even lose their driver's license. Why should it be any different for AVs? If an AV is a bad driver, then it should cost the AV company more, and if it is really serious, the AV should be removed from public roads. It makes no sense to punish the human if the AV is a bad driver, since the human is not driving.
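
The premium analogy translates directly into toy numbers. A sketch of how a higher at-fault rate raises what the responsible party pays (all figures invented for illustration, not actuarial data):

```python
def annual_premium(at_fault_per_mile: float, avg_claim_cost: float,
                   miles_per_year: float, expense_load: float = 1.3) -> float:
    """Expected annual claims cost plus an expense load.

    A bad driver (human or AV fleet) with ten times the at-fault
    rate pays roughly ten times the premium; bad enough, and the
    product is priced off the road.
    """
    return at_fault_per_mile * miles_per_year * avg_claim_cost * expense_load

good_av = annual_premium(1e-6, 20_000, 12_000)  # ~$312 per vehicle-year
bad_av = annual_premium(1e-5, 20_000, 12_000)   # ~$3,120 per vehicle-year
assert bad_av > good_av
```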
 
Automakers would only be held liable if the AP system were an autonomous system. If the AP system were a driver assist, the human would continue to be liable for accidents.

The manufacturer can be held liable if the driver assist system is unreasonably dangerous. Just putting "you are responsible" on the box doesn't absolve the manufacturer of all responsibility for bad things that happen as a result of poor design or defects.

At the moment, the family of a man killed by someone using Autopilot is suing Tesla on this very point. Their claim is based on the idea that the "hands on wheel" detection is completely inadequate and does nothing to verify that the driver is actually paying attention, while lulling them into a false sense of security.
 
Excellent comments, gentlemen! "To drive or to be driven", as Shakespeare already said. ;)

I looked up what the IIHS (Insurance Institute for Highway Safety) has said lately. This one is interesting:
"The Institute’s analysis suggests that only about a third of those crashes were the result of mistakes that automated vehicles would be expected to avoid simply because they have more accurate perception than human drivers and aren’t vulnerable to incapacitation. To avoid the other two-thirds, they would need to be specifically programmed to prioritize safety over speed and convenience".

https://www.iihs.org/news/detail/self-driving-vehicles-could-struggle-to-eliminate-most-crashes
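
That "programmed to prioritize safety over speed" point is essentially a statement about the weights in a planner's objective. A toy Python sketch (the weights and numbers are invented; real planners are vastly more involved):

```python
def plan_cost(collision_risk: float, travel_time_s: float,
              w_safety: float = 1e6, w_time: float = 1.0) -> float:
    """Toy planner objective: weighted sum of risk and time.

    Whether the AV avoids the "other two-thirds" of crashes comes
    down to how heavily its designers weight safety against
    speed and convenience.
    """
    return w_safety * collision_risk + w_time * travel_time_s

# Under a safety-heavy weighting, a slower plan with a tenth of
# the risk beats a faster but riskier one:
fast_risky = plan_cost(collision_risk=1e-4, travel_time_s=600)
slow_safe = plan_cost(collision_risk=1e-5, travel_time_s=660)
assert slow_safe < fast_risky  # 670 < 700
```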
 
If a human is a bad driver that causes accidents, they will likely pay higher insurance premiums and maybe even lose their driver's license.

Especially given the recent UI issues, here is a good point to consider. (Edit: does regulation have an obligation to vet the UI and deem it safe, unambiguous and functional? When is it driver error, driver confusion, driver distraction or a UI issue? Fault is already hard enough to establish without the vehicle autonomy aspect.)

Even more so as FSD owners clamour to get their hands on the latest software capabilities. Even now we see flagrant abuse of self driving that obviously contravenes Tesla's instructions and likely puts the errant driver in the wrong. But the debate still exists as to where the line is. Even current general-release FSD has city-based functions while arguably not being intended for city use. And that doesn't attempt to deal with 'what is a city?' Does a main freeway-type road passing through a city constitute 'city use'? More importantly, can J. Doe be expected to know for sure whether the car is in what the manufacturer currently regards as a 'city'?

In the case of Tesla, since AP availability turns on and off outside of my control, I believe that when the car makes it available it should be fine for me to use, with a clearly defined level of safety, regardless of beta, pre-release or whatever.

Once real L5 is possible, the debate largely disappears, but until then it is a hugely complex area.
 