Self-driving car users should have immunity from offences - UK Law Commission reports

I can see this, if accepted by Government, putting the final nail in the coffin of FSD - not just from Tesla, but from all car manufacturers. Their legal teams will s**t themselves at the thought of paying out for the potential number of "mistakes" that could occur, and of having to fight any claims in court. Full (L5) autonomy is looking highly unlikely to be offered, even if Tesla or anyone else cracked it!



The UK's Law Commissions propose that car manufacturers should be liable for a wide range of motoring offences, including dangerous driving, speeding and jumping red lights, while the vehicle is "self-driving".
The person in the driving seat would no longer be responsible for how the car drives; instead, the company or body that obtained authorisation for the self-driving vehicle would face regulatory sanctions if anything went wrong.
Under the recommendations, which will be considered by the English, Welsh & Scottish governments, the user-in-charge of a self-driving vehicle would still retain responsibilities such as carrying insurance and ensuring that children wear seatbelts.

The report says that there should be a clear distinction between driver support and self-driving and that a vehicle should only be classified as the latter if it is safe even when an individual is not monitoring the driving environment, the vehicle or the way that it drives.

The commissions say it should be permissible for an autonomous car to create a transition demand for the driver to take control if it confronts an issue it cannot deal with, but it must make the demand in a clear fashion, give the individual sufficient time to respond, and be able to mitigate the risk if a human fails to take over, by at least coming to a stop.
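For what it's worth, those three requirements read like a small state machine. Here's a minimal sketch of the idea - to be clear, the class, the mode names and the 10-second window are my own assumptions for illustration, not anything specified in the report:

```python
from enum import Enum, auto
import time

# Hypothetical response window; the article mentions figures around 10 seconds.
RESPONSE_WINDOW_S = 10.0

class Mode(Enum):
    SELF_DRIVING = auto()
    TRANSITION_DEMAND = auto()  # demand issued, waiting on the human
    HUMAN_DRIVING = auto()
    MINIMAL_RISK_STOP = auto()  # fallback: bring the vehicle to a stop

class TransitionDemandController:
    """Illustrative only: the report's three requirements as a state machine."""

    def __init__(self):
        self.mode = Mode.SELF_DRIVING
        self.demand_issued_at = None

    def issue_demand(self):
        # 1. Make the demand in a clear fashion (e.g. visual + audible alerts).
        self.mode = Mode.TRANSITION_DEMAND
        self.demand_issued_at = time.monotonic()

    def on_human_takeover(self):
        # 2. The human responded within the window; they resume driving.
        if self.mode is Mode.TRANSITION_DEMAND:
            self.mode = Mode.HUMAN_DRIVING

    def tick(self):
        # 3. Mitigate the risk if the human fails to take over in time,
        #    by at least coming to a stop.
        if (self.mode is Mode.TRANSITION_DEMAND
                and time.monotonic() - self.demand_issued_at > RESPONSE_WINDOW_S):
            self.mode = Mode.MINIMAL_RISK_STOP
```

The point of the sketch is that liability hinges on which state the vehicle is in when something goes wrong, which is exactly why the handover rules need to be nailed down in law.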
 
One reason why Tesla have started to offer their own car insurance? Also likely that if they ever get FSD working it will be on a rental basis which effectively is another way of taking insurance premiums from drivers.
More confusing is that the article stated drivers of self-driving cars should be able to take over within 10 seconds (which means that it isn't self-driving if you can't relax) and that they remain responsible in bad weather situations etc. None of that squares with Elon's claim that FSD will be safer than manual driving.
 
Tesla, or any other company making the kit that allows a car to drive itself, has to be liable.

In an autonomous taxi, it's not the rider's or owner's fault. It's the hardware and software service promising to do the task for you. We need the AI provider to be liable or it'll be "sorry grandma, you should have grabbed the wheel and avoided the pothole".

This is sensible, unlike the exploitation in the US, where Elon brilliantly takes advantage of the lack of national governance and advertising laws.

2016 cars that have FSD will be long scrapped before FSD works on single-lane roads from the 1800s with two-way traffic.
 
...
The commissions say it should be permissible for an autonomous car to create a transition demand for the driver to take control if it confronts an issue it cannot deal with, but it must make the demand in a clear fashion, give the individual sufficient time to respond...

I agree with the thread's title.

It's either self driving or not.

If it's not, then the human driver is responsible.

If it is, then a blind person can sit in the seat of a self-driving car, and that person is not responsible for taking over at any time.
 
I figure the real reason to write the law this way is that it will force the manufacturers to get it right, before FSD is released into the wild. As things stand, FSD is whatever Musk says it is -- a situation that means decisions on release, operation mode, etc. will be influenced by profitability considerations (and perhaps even the vicissitudes of personality).
 
It makes a lot of sense to me. Sure the legal teams will be quite busy, but they'll be driven by the sales teams who want to make a sale.

That being said, many cars already present a legal agreement when you start them up (usually around the satnav); I can imagine these getting more complex as manufacturers try to wriggle out of being responsible.
 
They will just charge people more to cover the liability - why else is Tesla progressively increasing the price? The only cost in the software is development, and I bet that's a fraction of what they charge today. Elon has talked about full autonomy costing over $50K when it's fully complete.

Tesla are also going to be creating autonomous taxis, so there's no change to that liability.

Nothing to see here, move along.
 
One reason why Tesla have started to offer their own car insurance? Also likely that if they ever get FSD working it will be on a rental basis which effectively is another way of taking insurance premiums from drivers.
More confusing is that the article stated drivers of self-driving cars should be able to take over within 10 seconds (which means that it isn't self-driving if you can't relax) and that they remain responsible in bad weather situations etc. None of that squares with Elon's claim that FSD will be safer than manual driving.
I have to say that sums the Autopilot feature up for me; the first time I had it switched on I instantly knew it wasn't for me. Admittedly it's handy if you're on a motorway and want a drink, it saves steering with your knees for a bit!
But the whole 'apply pressure' thing means you just spend your time monitoring what is unfolding and waiting to react, so you don't actually ever relax.
I guess on a side note, where the sensors and that level of autonomous driving really come into their own for me is collision warning and lane assist, which I still think are incredible :)
 
I can see this, if accepted by Government, putting the final nail in the coffin of FSD - not just from Tesla, but from all car manufacturers. Their legal teams will s**t themselves at the thought of paying out for the potential number of "mistakes" that could occur, and of having to fight any claims in court. Full (L5) autonomy is looking highly unlikely to be offered, even if Tesla or anyone else cracked it!
It's odd that you think that. Car manufacturers have already been thinking about this, and do feel that if the car is driving itself, passengers aren't liable. Hence also the effort from Tesla, and now GM, to offer insurance themselves.
 
They will just charge people more to cover the liability - why else is Tesla progressively increasing the price? The only cost in the software is development, and I bet that's a fraction of what they charge today. Elon has talked about full autonomy costing over $50K when it's fully complete.

Tesla are also going to be creating autonomous taxis, so there's no change to that liability.

Nothing to see here, move along.
That's dumb. I'll tell you why it's dumb: because Elon thought we'd already have fully autonomous cars by now.
 
That's dumb. I'll tell you why it's dumb: because Elon thought we'd already have fully autonomous cars by now.
Musk in January 2019:
“I think we will be feature complete, full self-driving, this year. Meaning the car will be able to find you in a parking lot, pick you up and take you all the way to your destination without an intervention. This year. I would say I am certain of that, that is not a question mark.”

That’s by no means his only ridiculously inaccurate prediction. What about the one million self driving robo taxis that he promised would be on the road by the end of 2020? He hasn’t got a clue.

“FSD” has barely progressed since I got my car in Sept 2019. Does anyone seriously think we’ll have level 5 autonomy by the end of the decade?

But I do agree that manufacturers should be responsible if their autonomous vehicles fail.
 
Just FYI, this topic is not about Elon's past statements about having complete self-driving. This topic is about the UK Law Commissions' proposal to assign liability to the manufacturers of self-driving cars, not the passengers.
But if the car is designated as being fully autonomous then clearly the responsibility will not lie with anyone in the car ... they will not be driving! Maybe it's confusing because we think about "Autopilot" or "FSD" as they exist today, where clearly this would be a big issue. Truly fully autonomous vehicles wouldn't need any intervention ... and ultimately may not even include the option for human intervention. You wouldn't be expected to take over driving responsibility in a present-day taxi ... it would be the same with a Robo Taxi or fully autonomous car ... the person travelling in the car may not even be a qualified driver.
 
What’s the alternative? The owner being responsible even though they’re not necessarily present? Nobody? I’ve always thought it would need to be the manufacturer, which then raises a whole heap of other ramifications:

- modifications and maintenance of the car: either these are controlled, or the car needs to be able to sufficiently self-test to know it’s able to take responsibility

- the transition between driver and system. There are already rules around 7 or so seconds for the car to hand control back to the driver when it knows it won’t be able to cope (relevant to levels 3 and 4). Tesla are doing nothing in this area, at least nothing we’ve seen. What you can’t have is the car going ‘bugger.. can’t cope’ as it’s about to crash, handing over with fractions of a second for the driver to cope. That transition needs to be laid down in law.

- speeding and errors in databases and maps. Even the good systems such as Mobileye aren’t perfect, but if the car’s driving it needs to be much, much more accurate, and even then how mistakes are handled needs to be agreed. Musk might get a lot of points fairly quickly otherwise.

- evidence of who (human or computer) was driving and when. You can’t assume autonomy is perfect; aside from speeding, what about failing to give way, not following the direction of a police officer, entering a one-way street the wrong way, etc.? With video captured and the registered keeper asked who was driving, it would be easy to always blame the car. I can see some visible indication on the car being needed, plus a tamper-evident record of mode changes (see the sketch after this list)

- what penalties do we place on the manufacturer, or do they just get told to turn it off

- software updates and how well retested they are

That’s just a quick list of some sizeable challenges, and why we’re a long way off anything other than very limited level 3 like the Merc system, which as I understand it is almost pointless given that its speed and road-type restrictions are largely incompatible with each other.
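On the "evidence of who was driving" point, here's the kind of tamper-evident record I have in mind: an append-only log of mode changes where each entry commits to the previous one, so history can't be quietly rewritten after an incident. A rough sketch only - the names and format are hypothetical, not any manufacturer's actual system:

```python
import hashlib
import json
import time

def chain_hash(prev_hash: str, record: dict) -> str:
    # Each entry commits to the previous hash, so tampering with any
    # earlier entry invalidates every later one.
    payload = prev_hash + json.dumps(record, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

class DrivingModeLog:
    """Hypothetical append-only record of who (human or system) was driving."""

    def __init__(self):
        self.entries = []          # list of (record, hash) pairs
        self.last_hash = "0" * 64  # genesis value

    def record(self, mode: str, reason: str):
        rec = {"t": time.time(), "mode": mode, "reason": reason}
        self.last_hash = chain_hash(self.last_hash, rec)
        self.entries.append((rec, self.last_hash))

# Example: the trail an insurer, court or police force might later inspect.
log = DrivingModeLog()
log.record("SELF_DRIVING", "user engaged automated mode")
log.record("TRANSITION_DEMAND", "heavy rain beyond system limits")
log.record("HUMAN_DRIVING", "driver took over within the window")
```

Something like this would settle "who was driving at the moment of the offence" without having to take the registered keeper's word for it.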
 
*pile-up occurs on the road 100ft ahead*

*you continue snoring*

*KLAXON*

“Warning, <driver name>, Autopilot system cannot continue, you are forewarned that full responsibility is being transferred back to you, please say “I want to know more” if you wish to hear my legal terms, otherwise prepare to take control in 3, 2, 1” … “happy driving!”