
AV makers should be held responsible for accidents, says UK Law Commission

So we have a brave new world where...

-- AP systems become safer than human drivers (even L3 will probably get there fast)
-- ... BUT automakers are held legally liable for accidents when AP is engaged
-- ... AND so they (the automakers) disable the AP systems because they cannot afford the liability
-- ... AND so more people die in driving accidents with cars driven by bad human drivers

Wonderful.
Did you know that Tesla Insurance includes a line saying the policy covers self-driving accidents?
 
I looked up what the IIHS (Insurance Institute for Highway Safety) has said recently. This one is interesting:
"The Institute’s analysis suggests that only about a third of those crashes were the result of mistakes that automated vehicles would be expected to avoid simply because they have more accurate perception than human drivers and aren’t vulnerable to incapacitation. To avoid the other two-thirds, they would need to be specifically programmed to prioritize safety over speed and convenience".

Wouldn't that be the default choice for most manufacturers? I can't see many of them wanting to increase their own liability.

Or will we get manufacturer specific choices like BMWs that never indicate and always do 20 kph over the limit?
 
I used to work for a UK institution that worked with various bodies, including government and insurance companies, on researching future transportation directions and policy. Whilst I never worked in the areas being discussed, internal knowledge transfer and regular lunchtime presentations meant that everyone had a fair idea of what was currently being worked on and what was trending.

So, this is several years out of date, but the idea was sufficiently entrenched within the (at least European) transport industry that it probably still reflects current thinking: truly autonomous vehicles would no longer be treated as traditional vehicles as far as insurance and liability were concerned, and liability would fall firmly in the hands of the vehicle manufacturer. It was stated several times that your car (if you actually owned one at that point in time; vehicle pooling/robotaxis were much more part of the overall picture) would be insured like any other insurable asset, albeit a very expensive one. Accident-wise, liability would be very much the manufacturer's domain, so long as the vehicle had been properly maintained and its software updated in accordance with the manufacturer's instructions.
 
So we have a brave new world where...

-- AP systems become safer than human drivers (even L3 will probably get there fast)
-- ... BUT automakers are held legally liable for accidents when AP is engaged
-- ... AND so they (the automakers) disable the AP systems because they cannot afford the liability
-- ... AND so more people die in driving accidents with cars driven by bad human drivers

Wonderful.

If they're safer, then the manufacturer will be able to afford the liability because their insurance premium will be lower.

It's the UK, so nuisance suits are much less likely.

The recommendation was an obvious one.
 
If they're safer, then the manufacturer will be able to afford the liability because their insurance premium will be lower.

It's the UK, so nuisance suits are much less likely.

The recommendation was an obvious one.

The manufacturer won't pay it; they will just pass the cost on to the vehicle somehow, either at purchase or by making the self-driving capability some kind of subscription service.
 
That kind of thing has been suggested before; the problem is creating a viable sensor that can reliably tell whether someone is actually above the legal limit.

There is a reason why those breath testers are only used as the first step, an indicator that needs to be confirmed with a blood test. They just are not that reliable or accurate, and it would be a problem if your car wouldn't start because you took some medication that caused a false reading.

It would also be problematic from a practical point of view. The breath testers have disposable mouthpieces that you put your lips around; they are used once and discarded, which creates a lot of plastic waste. Cleaning might be an option, but especially these days with COVID...

And after all that it wouldn't detect things like drug use or tiredness.

As usual this totally misses the point. It's not blood alcohol per se that is the concern, it's the effect it has on the ability to drive. My wife has a phenomenally low tolerance for alcohol, and after literally ½ glass of wine (making her way below the legal limit) she is truly incapable of driving. Conversely, I can tolerate far more (really). This means that, according to the law, if we go out to a bar it should be my incapacitated wife who drives, not me. Similarly, diabetics can sometimes lose control of blood sugar, which produces very similar effects to alcohol intoxication, yet legally they can still drive.

If you want to be scientific about driving "while under the influence" you should look at the direct effect on driving competence, not an averaged "one size fits all" check of blood chemistry. As has been noted many times in the past, the correct way to test is some form of reaction testing. To an extent, the US informal sobriety tests (walking a line, touching fingers together, etc.), inaccurate and humorous though they are, are a far better indication of driving capacity than blood alcohol level.

This is a classic case of indirect regulation getting out of control. "We want to regulate B (driver incapacitation). A (alcohol) causes B. It's easier to regulate A than B, therefore we regulate A." This fails when (a) A is not the only cause of B, or (b) when A only sometimes causes B.
 
This news relates to the UNECE homologation rules agreed last June among 60+ UN countries. Germany, South Korea, Japan, Austria, and many others have already adopted these rules.

This might sound counterintuitive, but this is good news for automakers, as it is what they want/need (at least for the ones trying to make and sell self-driving cars).

Also, to be clear, Tesla Autopilot and Tesla FSD do not fall within the definition of an AV under these new UK laws.
 
If they're safer, then the manufacturer will be able to afford the liability because their insurance premium will be lower.

It's the UK, so nuisance suits are much less likely.

The recommendation was an obvious one.

I think you are being naive. If the makers have liability they are going to shut the systems down... all of them, even the mild lane-keeping systems. Period. Net result: no driver assists, more deaths.

No L3 or above self-driving car is ever going to be foolproof. But if they can be shown to be significantly safer than the average human driver, then there is a clear advantage to their widespread deployment. In saved lives, reduced injuries, and fewer overall accidents.

But pushing all liability onto the makers raises the bar for deployment way beyond just being better than a human driver. In fact, it probably raises the bar so high the system would need to be near perfect, which is unrealistic for the foreseeable future and given the nature of AI/NN technology.

If/when Tesla release FSD you can be sure that they will make it clear that the driver is responsible for safety.
 
As usual this totally misses the point. It's not blood alcohol per se that is the concern, it's the effect it has on the ability to drive. My wife has a phenomenally low tolerance for alcohol, and after literally ½ glass of wine (making her way below the legal limit) she is truly incapable of driving. Conversely, I can tolerate far more (really). This means that, according to the law, if we go out to a bar it should be my incapacitated wife who drives, not me. Similarly, diabetics can sometimes lose control of blood sugar, which produces very similar effects to alcohol intoxication, yet legally they can still drive.

If you want to be scientific about driving "while under the influence" you should look at the direct effect on driving competence, not an averaged "one size fits all" check of blood chemistry. As has been noted many times in the past, the correct way to test is some form of reaction testing. To an extent, the US informal sobriety tests (walking a line, touching fingers together, etc.), inaccurate and humorous though they are, are a far better indication of driving capacity than blood alcohol level.

This is a classic case of indirect regulation getting out of control. "We want to regulate B (driver incapacitation). A (alcohol) causes B. It's easier to regulate A than B, therefore we regulate A." This fails when (a) A is not the only cause of B, or (b) when A only sometimes causes B.

Legally it is the point. But in any case, the only other technology for detecting this at the moment is watching for drifting within the lane or over the lines. Some cars already have that; they issue a warning that the driver is tired.
 
We're really getting close to the science fiction future. Take any of the concepts: the series Upload for example. People are driven in 'pods' with no control other than the destination. It is considered inconceivable that they should crash, and it takes overt hacking to engage manual driving, and implied hacking to cause the driver fatality. Interestingly, no official investigation is conducted into the crash.

Actually there is one control the occupant has. They can enable "Driver Priority" or "Pedestrian Priority" mode. Discuss...

Waymo is basically doing automated driving. Are they at full liability for everything including passenger injuries? I wonder what the passenger agreement says when you sign up.
 
We're really getting close to the science fiction future. Take any of the concepts: the series Upload for example. People are driven in 'pods' with no control other than the destination. It is considered inconceivable that they should crash, and it takes overt hacking to engage manual driving, and implied hacking to cause the driver fatality. Interestingly, no official investigation is conducted into the crash.

Johnnycab called it.

Waymo is basically doing automated driving. Are they at full liability for everything including passenger injuries? I wonder what the passenger agreement says when you sign up.

Waymo accepts all liability. Passengers can't do anything, they sit in the back.
 
If/when Tesla release FSD you can be sure that they will make it clear that the driver is responsible for safety.
Who is responsible when there is no one in the car? A fully autonomous system can operate without anyone in the vehicle.

So let's say, for example, you have your car drop you off at work. On the way to work, in your example, the occupant of the car (you) is responsible for any accidents. After you are dropped off at work, the car returns to your house to take your wife to work. During this driving segment, are you responsible for any accidents? Is your wife? Is the car?
 
I think you are being naive. If the makers have liability they are going to shut the systems down... all of them, even the mild lane-keeping systems. Period. Net result: no driver assists, more deaths.

I think you are being naive. You propose a scenario in which the assignment of responsibility to the passenger is entirely justified by the (hypothetical) non-viability of the product otherwise. That is, the passenger must be scapegoated so that the business isn't threatened by its own faults. For starters, this already partly paints a ****ing dystopian corporate-dominated future right there, but onward.

1) There already are companies attempting the fully driverless approach, taking liability (Waymo, for one).
2) I doubt evident scapegoating will be accepted by the public or even be legal to begin with. The passenger isn't driving the autonomous vehicle; this point is absolutely trivial, and is a big part of what'd encourage quality in the service and proper dedication to safety.

I don't just believe you're wrong that scapegoating the users is the only way for AVs to be viable. I specifically think it's an approach that will not work, as it depends on our willingness to submit to the most abhorrent of moral standards, in a cyberpunk corpo-driven sort of style, for ****'s sake: settling blame and punishment on someone for a company's fault because it benefits the business. If that's necessary, the service has no business existing in the first place.

If/when Tesla release FSD you can be sure that they will make it clear that the driver is responsible for safety.

No argument there. FSD will remain in perpetual beta as is Tesla custom. Tesla is, as we all know, not just a car company; it is also a beta-software company. Enjoy your purchase!
 
Enjoy your purchase!
Why are you so butthurt about something you don't even own?
There are so many people on these forums who have not and will not buy FSD but post incessantly about it.
I have a hard time understanding that...
I understand folks who bought FSD and regret the purchase, etc. But to not own something yet invest so much time discussing/bashing it is something else entirely!
 
Why are you so butthurt about something you don't even own?
There are so many people on these forums who have not and will not buy FSD but post incessantly about it.
I have a hard time understanding that...

I'm making fun of it, and I figured that'd be evident if you had only quoted the rest of that paragraph. Only one of us is butthurt.

Anyway, what do you think is unfathomable about criticism of terrible business practices? I am interested in autonomous driving, and a few years ago I was interested in Tesla in particular for that reason. That has soured since; hence my comment.

I do wonder what you have to gain with your relentless defense of Tesla FSD and attack on competitors by, say, ensuring diplomat33's comments always have one disagree, even when he's just factually correct, and only seldom actually responding with a why.
 
I do wonder what you have to gain with your relentless defense of Tesla FSD
That is very simple, and selfish: I have seen and experienced more progress with my own Tesla cars and Autopilot updates than the rest of the "industry leaders" combined, and I want that progress to continue.
No amount of demos or PR releases by "autonomy" companies will compare to what I get to drive every day.

and attack on competitors by, say, ensuring diplomat33's comments always have one disagree.
Disagreeing is not attacking... (please learn the difference); that is the problem with society in general at the moment.
 
That is very simple, and selfish: I have seen and experienced more progress with my own Tesla cars and Autopilot updates than the rest of the "industry leaders" combined, and I want that progress to continue.
No amount of demos or PR releases by "autonomy" companies will compare to what I get to drive every day.


Disagreeing is not attacking... (please learn the difference); that is the problem with society in general at the moment.

Fair enough with that first answer.

As for the rest, I specifically meant the systematic disagrees with no explanation of why, even on mere statements of fact. Disagreeing is indeed not attacking, but your brand of pressing that disagree button (which isn't necessarily the act of disagreeing) always seemed spiteful. I can't recall the particular examples, so I concede regardless.

Derisive criticism isn't being butthurt either, though... (please learn the difference); that is the problem with society in general at the moment. Hopefully this also answers your first question.
 
Disagreeing is indeed not attacking, but your brand of pressing that disagree button (which isn't necessarily the act of disagreeing) always seemed spiteful.
For what it's worth, I only press the "disagree" if I actually disagree with something he said.
Mind you, there is a LOT to disagree with.

But if I were to actually comment on every item I disagreed with, I would not get anything else accomplished during the day.
 
Who is responsible when there is no one in the car? A fully autonomous system can operate without anyone in the vehicle.

So let's say, for example, you have your car drop you off at work. On the way to work, in your example, the occupant of the car (you) is responsible for any accidents. After you are dropped off at work, the car returns to your house to take your wife to work. During this driving segment, are you responsible for any accidents? Is your wife? Is the car?

My comments were in the context of FSD, which is not L4/L5, so the question is not germane.
 
Who is responsible when there is no one in the car? A fully autonomous system can operate without anyone in the vehicle.

So let's say, for example, you have your car drop you off at work. On the way to work, in your example, the occupant of the car (you) is responsible for any accidents. After you are dropped off at work, the car returns to your house to take your wife to work. During this driving segment, are you responsible for any accidents? Is your wife? Is the car?

I think the manufacturers will find any reason to deny liability. Here are some examples:
  • You failed to maintain the car properly, including cleaning the cameras when required
  • You failed to update the software in a timely manner
  • A non-approved third party did maintenance on the car
  • Someone hacked the car's software
  • Someone used a non-approved modification in some manner
  • Someone sabotaged the external environment (e.g. a missing street sign)
  • You started your trip with the car in an unsafe situation (e.g. next to a sinkhole, in a protest, with someone hiding under the car)
  • You transported illegal cargo or exceeded the occupant limit
  • Occupants interfered with the operation of the car
  • Acts of God
  • Unforeseen situations that even a "responsible driver" would not have been able to handle
  • You somehow failed to meet the terms of the legal agreement in some nitpicking manner
And of course
  • The other driver is responsible for the accident