Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
I don’t think the comparison can be made between a business running robotaxis and Joe Public with his vehicle. It’s like comparing Delta Air Lines and their insurance company with a private pilot and his own plane.
Actually, it's a good comparison. In both cases, the passengers are not controlling the automobile. All control comes from the manufacturer's software. The only difference is who holds title to the vehicle. Assuming, of course, that the private owner has not made any unapproved modifications to the car.
 
And my point has always been that a true self-driving vehicle will always confer responsibility on the owner of said vehicle. The manufacturer won’t touch the liability with a ten-foot pole.

This is actually a pretty useful distinction. Waymo is both the manufacturer of the software and the owner of the vehicle. But if, for example, Waymo decided to sell their vehicles to Hertz tomorrow, who would be responsible for a crash? Waymo or Hertz? I imagine it would be spelled out in the contract language of the purchase agreement, regardless of SAE levels.
 
Maybe at 37 mph. Definitely way easier, but I have a feeling we're going to see some crazy edge cases.
I'll be laughing the first time an L3+ vehicle gets damaged by a common pothole.

Changing the subject: here is GM saying they will have L3+ by mid-decade. No details. It doesn't say whether it is Mobileye, Cruise, or some other tech (Nvidia?).
 
Mercedes is responsible according to the law for their Drive Pilot L3 system.
Nobody is saying that the car is responsible. The manufacturer of the car is responsible if the car is driving. Seems very intuitive to me.
It is likely that any self-driving car will have the expected liability cost built into the purchase or subscription price for that feature, very much like product liability is priced into anything today.

A certain portion of the cost of any Tesla goes toward covering the expected liability for design and manufacturing defects, just as a portion of the cost goes to pay for the expected warranty claims. It's not itemized, but it's in there.
 
Let’s revisit this liability topic (with regard to cars sold to consumers) around 2027-2029. There might be a brief window before that, but my guess is it will rapidly disappear again. The first mover is going to get a beatdown. But hope will spring eternal.
 
And my point has always been that a true self-driving vehicle will always confer responsibility on the owner of said vehicle. The manufacturer won’t touch the liability with a ten-foot pole.

"True self-driving" implies the owner is a passenger in the car, not the driver in any way. If that is the case, then the owner cannot be responsible. How could they be when they are just a passenger and not responsible for any of the driving tasks? For true self-driving, the manufacturer would be responsible, since they designed and built the system that was responsible for all the driving tasks. And if the manufacturer does not want to accept responsibility, they can designate their driving system as L2, but then that is not true self-driving.

But that is one reason why we have the levels of automation: terms like "self-driving" or "true self-driving" can mean different things to different people.

L2 = the car performs some driving tasks but the driver still performs some driving tasks too. Driver is always responsible.
L3 = the car performs all driving tasks when it is on, but the driver is the fallback. The car is responsible since it performs all driving tasks, though admittedly there could be a grey area in some instances, since the driver was the fallback.
L4/5 = the car performs all driving tasks. The car can also perform the fallback. There is no human driver at all. The owner is a passenger. This is clear cut. The car, i.e. the manufacturer, is responsible.

For me, "true self-driving" is only L4/5.
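As a toy illustration of the breakdown above, the responsibility rule could be sketched in code. This is purely hypothetical (not any real API or legal advice); the function name and return strings are made up for the example:

```python
# Hypothetical sketch: map an SAE automation level to the party presumed
# responsible while automation is engaged, per the breakdown above.

def responsible_party(sae_level: int, automation_engaged: bool) -> str:
    """Return who is presumed responsible for the driving task."""
    if sae_level <= 2 or not automation_engaged:
        # L0-L2 (or automation off): the human driver supervises
        # and is always responsible.
        return "driver"
    if sae_level == 3:
        # L3: the system drives, but the driver is the fallback,
        # so responsibility can be a grey area at handover.
        return "manufacturer (grey area at handover)"
    # L4/L5: no human driver at all; occupants are passengers.
    return "manufacturer"

print(responsible_party(2, True))   # driver
print(responsible_party(4, True))   # manufacturer
```

The grey area at L3 is exactly why the Mercedes Drive Pilot case mentioned earlier is notable: the manufacturer explicitly accepts the liability the sketch assigns to it.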

I think that may not be relevant in terms of the law. The car is not sentient.

The self-driving car does not have to be sentient. It is performing all the driving tasks and is therefore responsible for the driving.

This is actually a pretty useful distinction. Waymo is both the manufacturer of the software and the owner of the vehicle. But if, for example, Waymo decided to sell their vehicles to Hertz tomorrow, who would be responsible for a crash? Waymo or Hertz? I imagine it would be spelled out in the contract language of the purchase agreement, regardless of SAE levels.

The manufacturer designed and created the self-driving software. They would logically be responsible. I don't see how the owner of the vehicles would be responsible for something they had nothing to do with. They are simply using the vehicles; they did not design or create the software doing the actual driving. So in your scenario, if Waymo sold or leased their 5th-gen I-PACEs to Hertz for a rental service, I think Waymo would still be responsible for any accidents. The exception I see is maintenance: Hertz would likely be responsible for vehicle maintenance, so if an accident was caused by poor maintenance, Hertz could be liable. But assuming the vehicle was in good working order, if the Waymo Driver caused or failed to avoid an accident, Waymo would be liable, not Hertz. That's how I see it.

 
We live in a litigious society. If my pet Siberian tiger eats my neighbour, whom will his wife sue: me or the tiger? As the owner of the tiger, I should have had control of it. Same as a misbehaving car.
 
We live in a litigious society. If my pet Siberian tiger eats my neighbour, whom will his wife sue: me or the tiger? As the owner of the tiger, I should have had control of it. Same as a misbehaving car.

But who has deeper pockets, little ol' phonetele226 or billion dollar company Tesla?

Car companies are held accountable for all kinds of design flaws (e.g. ignition switches, exploding gas tanks). The owner of the car is no more responsible for the design of the software controlling the car than they are for those issues.
 
We live in a litigious society. If my pet Siberian tiger eats my neighbour, whom will his wife sue: me or the tiger? As the owner of the tiger, I should have had control of it. Same as a misbehaving car.

That is not how self-driving cars work. If you are expected to be in control of your self-driving car in case it misbehaves, then it is not true self-driving. True self-driving is where you are NOT expected to control the car if something goes wrong. A true self-driving car is expected to handle everything on its own without any human oversight. A true self-driving car is its own driver. Think of it this way: if you are riding in the back seat of a taxi and the taxi gets into an accident, are you responsible? Of course not! The taxi driver will get sued, but not you. The taxi driver was responsible; you were just a passenger. That's how true self-driving works. The car is the driver, you are the passenger. So if the car gets into an accident, it is responsible, since it was the driver. Of course, you can't sue a car. The manufacturer will get sued.
 
Elon has promised that a Tesla will be able to go from NYC to LA with no human input, via Smart Summon, which means no one is even INSIDE the car during this voyage. That would imply that Tesla is accountable for the entire trip across the country, and that if an accident occurs which is the fault of the car, Tesla assumes liability.
 