That is not how self-driving cars work. If you are expected to be in control of your self-driving car in case it misbehaves, then it is not true self-driving. True self-driving is where you are NOT expected to control the car if something goes wrong. A true self-driving car is expected to handle everything on its own, without any human oversight. A true self-driving car is its own driver. Think of it this way: if you are riding in the back seat of a taxi and the taxi gets into an accident, are you responsible? Of course not! The taxi driver will get sued, not you. The taxi driver was responsible; you were just a passenger. That's how true self-driving works. The car is the driver, you are the passenger. So if the car gets into an accident, it is responsible, since it was the driver. Of course, you can't sue a car. The manufacturer will get sued.
To add to this, let's imagine you purchase an elevator for an office building. You maintain it per the manufacturer's specifications. You don't overload it. But one day something on the tracks jams, and due to a software fault the elevator doesn't detect this. Instead of stopping and sounding an alarm, it keeps pushing until the motors fail, causing a fire in the penthouse, jamming the elevator car, and trapping the passengers.

Who is responsible for the damage to the building, the passengers, and the elevator itself? Certainly not the elevator rider nor the owner of the building.
 
We live in a litigious society. If my pet Siberian tiger eats my neighbour, who will his wife sue? Me or the tiger? As the owner of the tiger, I should have had control of it. Same as a misbehaving car.
Nobody would sue a tiger because tigers have no assets. You would sue Siberia, since that's where the tiger programming was originally established.

Of course, this is as ridiculous as your tiger analogy, given that tigers are wild animals, not human-created machines.

You need to go back to a real law school. Your eBay law degree was overpriced.
 
Who is responsible for the damage to the building, the passengers, and the elevator itself? Certainly not the elevator rider nor the owner of the building.

You’re not wrong, but in your hypothetical example the owner of the building will be named in the lawsuit filed by the trapped rider. Prior to the discovery phase (where the rider’s investigators get access to all the maintenance records and physical evidence) there hasn’t been a determination of fact about whether or not the building owner did anything that could have reasonably (in the eyes of a jury) contributed to the incident, and therefore the inconvenience, injury, or emotional distress suffered by the rider.

So there is still risk borne by the building owner, even if she did everything right but still must bear the cost of defending against that sort of lawsuit.

Bringing the analogy back to Level 3 self-driving: when the inevitable collision occurs, the owner of the vehicle (and her insurance company) will still be on the hook for legal representation and other litigation costs. That's not nothing.
 
Nobody would sue a tiger because tigers have no assets. You would sue Siberia, since that's where the tiger programming was originally established.

Of course, this is as ridiculous as your tiger analogy, given that tigers are wild animals, not human-created machines.

You need to go back to a real law school. Your eBay law degree was overpriced.
You guys are missing the whole point of this. We will NEVER get to a full self driving car. Mankind will NEVER abdicate the responsibility for his life to a machine.
 
You guys are missing the whole point of this. We will NEVER get to a full self driving car. Mankind will NEVER abdicate the responsibility for his life to a machine.

That's like people saying "mankind will never trust riding in metal tubes flying through the air", yet people fly in airplanes all the time. We already have some full self-driving cars now. There will be more. You can't stop technology.
 
FSD in its current form is no different from cruise control or a seatbelt; a Level 3+ system operating within its ODD is a different story.

I'm not aware of any "liability problem" in the way you're spinning it, though; everything in terms of liability seems quite clear. You don't own the driving task when using a Level 3+ system within its ODD, and a Level 4+ system might not even have a steering wheel or pedals. It's like hopping in a Waymo or Cruise, except you own the vehicle itself; you aren't responsible for what the vehicle does when the system is operating.

The only problem I see is the manufacturer feeling comfortable taking on the risk, which is why Tesla needs the system to show levels of safety/reliability far in excess of human drivers. If FSD were merely as good as the average human driver and all accidents were now owned by Tesla, I think they'd have a bad time.
If you own the car, you own the liability. That can be offset by liability assigned -- normally by a jury, mind you, and on a case-by-case basis -- to the manufacturer based on the failure of the car to perform the way it states in the manual, as was characterized by the manufacturer, and/or as a reasonable driver may expect. The court does not care about SAE driving levels and ODDs. Again, if the legislature in a given jurisdiction decides to step in and eff it up, then there could be some issues that will have to be fleshed out. But the way it works today in most places is all settled.
 
If you own the car, you own the liability. That can be offset by liability assigned -- normally by a jury, mind you, and on a case-by-case basis -- to the manufacturer based on the failure of the car to perform the way it states in the manual, as was characterized by the manufacturer, and/or as a reasonable driver may expect. The court does not care about SAE driving levels and ODDs. Again, if the legislature in a given jurisdiction decides to step in and eff it up, then there could be some issues that will have to be fleshed out. But the way it works today in most places is all settled.
I can’t tell if you’re disagreeing or not. If the owner’s manual says that you can sleep in the backseat while the car drives itself then obviously a jury would find the manufacturer liable for driving the car.
The court of course does care about SAE levels here in California as they are included in the autonomous vehicle regulations.
 
Who is responsible for the damage to the building, the passengers, and the elevator itself? Certainly not the elevator rider nor the owner of the building.
Yes, the owner of the building. She owns the elevator. However, if the elevator failed due to design problems, then the elevator manufacturer may be assigned some (or potentially all, as in your scenario) of the liability to offset the owner's.
 
I can’t tell if you’re disagreeing or not. If the owner’s manual says that you can sleep in the backseat while the car drives itself then obviously a jury would find the manufacturer liable for driving the car.
The court of course does care about SAE levels here in California as they are included in the autonomous vehicle regulations.
I am saying that the owner of the car owns the liability (I don't know how I can make that clearer), and that liability may be offset by liability of the manufacturer for design/manufacturing defects, improper training, falsely claiming what the capabilities of the system are, etc. As for the courts caring about SAE levels in California: the courts don't promulgate the regulations; the executive branch does, under authority granted to it by the California legislature. A lawyer may use the regulations in a civil court to attempt to assign liability to the manufacturer. But I would be willing to bet, at least today, that a judge is not going to dismiss a civil torts case against a driver -- in California or otherwise -- because a car owned by that driver was supposedly operating under SAE Level 3 or whatever. I have to admit, however, that I can't speak intelligently about torts in non-US courts.

What I am trying to address is this idea that there is a "liability problem" that needs to be worked out in the courts or the legislature before autonomous cars "take to the road," "become ubiquitous," "start killing people," or whatever other euphemism we hear from the media. There is no problem. Our courts and the case law are already perfectly able to handle it; the case law will evolve over time (as it always does) and the courts will handle that too. It doesn't require anything special to be done to address any particular problems.
 
Mercedes is testing Drive Pilot in California. They've only reported one collision so far, which probably means they're not testing all that much, though.
Yes - no "million mile real-world testing," as I said. Yet, already a collision. IIRC, they are only testing with a handful of cars. At that rate they would have to test for several years to get statistically valid disengagement rates - if they are anywhere near human level.
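
To put rough numbers on why (a back-of-the-envelope sketch; the human crash rate, fleet size, and annual mileage below are all assumed figures, not Mercedes' actual data):

```python
# Rule of three: observing zero crashes in N miles puts the 95% upper
# confidence bound on the true crash rate at roughly 3 / N per mile.

HUMAN_CRASH_RATE = 1 / 500_000   # assumed: ~1 police-reported crash per 500k miles

# Zero-incident miles needed to show, at ~95% confidence,
# a crash rate below the assumed human baseline:
miles_needed = 3 / HUMAN_CRASH_RATE
print(f"{miles_needed:,.0f} miles")       # 1,500,000 miles

# With a hypothetical 10-car test fleet at 20k miles per car per year:
years = miles_needed / (10 * 20_000)
print(f"{years:.1f} years of testing")    # 7.5 years
```

So "several years" is about right for a handful of cars, and a single collision pushes the required mileage even higher.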
 
Who is responsible for the damage to the building, the passengers, and the elevator itself? Certainly not the elevator rider nor the owner of the building.
The elevator / software manufacturer's problem.

Not a lawyer - but unless some kind of negligence can be shown on the part of the owner, they are not responsible.

It is not any different from a UPS delivery person falling in front of your house (on your property). It is not your fault - unless there is some kind of negligence on your part that can be proved.
 
I can’t tell if you’re disagreeing or not. If the owner’s manual says that you can sleep in the backseat while the car drives itself then obviously a jury would find the manufacturer liable for driving the car.
The court of course does care about SAE levels here in California as they are included in the autonomous vehicle regulations.
Like the owner's manual will EVER say you can fall asleep while the car drives. You must be smoking something.
 
Yes - no "million mile real-world testing," as I said. Yet, already a collision. IIRC, they are only testing with a handful of cars. At that rate they would have to test for several years to get statistically valid disengagement rates - if they are anywhere near human level.
Not if you use customers to test. ;)
We have no idea how much they're testing until the DMV releases the yearly mileage early next year. Yeah, I would expect to get rear-ended more than once per million miles driving in LA traffic, so I agree that it's probably much less than a million miles so far.
It is not any different from a UPS delivery person falling in front of your house (on your property). It is not your fault - unless there is some kind of negligence on your part that can be proved.
This is also an important part of the much-maligned SAE standard. An L3-L5 system cannot be engaged outside its operational design domain, which severely limits the ability of the manufacturer to blame the users. With an L3 system, the driver is required to detect obvious mechanical failures (like a wheel bearing about to fall apart) but not to know when and where the system can be engaged.
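
For what it's worth, the gating logic itself is trivial; here's a minimal sketch (all names and ODD limits are hypothetical, loosely modeled on a Drive Pilot-style traffic-jam system):

```python
from dataclasses import dataclass

@dataclass
class Conditions:
    road_type: str     # e.g. "divided_highway" or "city_street"
    speed_mph: float
    raining: bool

class Level3System:
    """Hypothetical L3 feature limited to slow divided-highway traffic."""
    MAX_SPEED_MPH = 40.0  # assumed ODD speed cap

    def can_engage(self, c: Conditions) -> bool:
        # The system, not the driver, decides whether it is inside its ODD.
        return (c.road_type == "divided_highway"
                and c.speed_mph <= self.MAX_SPEED_MPH
                and not c.raining)

    def engage(self, c: Conditions) -> bool:
        if not self.can_engage(c):
            return False  # request refused; the driver keeps the driving task
        return True       # system accepts the driving task

l3 = Level3System()
print(l3.engage(Conditions("divided_highway", 35.0, False)))  # True
print(l3.engage(Conditions("city_street", 25.0, False)))      # False
```

Because the engage request is simply refused outside the ODD, "the driver turned it on in the wrong place" stops being an available defense for the manufacturer.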
 
The elevator / software manufacturer's problem.

Not a lawyer - but unless some kind of negligence can be shown on the part of the owner, they are not responsible.

It is not any different from a UPS delivery person falling in front of your house (on your property). It is not your fault - unless there is some kind of negligence on your part that can be proved.
Tesla says the driver must be ready to intervene at any time while using FSD. Do you really think their lawyers would ever let them remove that stipulation and be liable? Not in a million years.
 
If you own the car, you own the liability. That can be offset by liability assigned -- normally by a jury, mind you, and on a case-by-case basis -- to the manufacturer based on the failure of the car to perform the way it states in the manual, as was characterized by the manufacturer, and/or as a reasonable driver may expect. The court does not care about SAE driving levels and ODDs. Again, if the legislature in a given jurisdiction decides to step in and eff it up, then there could be some issues that will have to be fleshed out. But the way it works today in most places is all settled.
You're referring to product liability laws?

I think everyone expects a shift to product liability when a Level 3+ system is active; that is really what defines Level 3+. If the human in the vehicle owns the DDT and the liability, it's not Level 3+. If that's the case, I don't think they'll ever be out driving around empty in Robotaxi mode.

For autonomous systems that can be flipped on/off, authorities like NHTSA are working with manufacturers to mandate robust data collection (which Tesla already has) that will be used to determine who is liable. They're producing massive amounts of data around things like driver monitoring alerts leading up to an event, when the system was engaged/disengaged, steering wheel torque input, camera feeds, etc.
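
As a sketch of what such an event record might contain (field names are purely illustrative, not NHTSA's or any manufacturer's actual schema):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class EventDataRecord:
    timestamp_utc: str                # when the event occurred
    system_engaged: bool              # was the automation active at impact?
    seconds_since_engagement: float   # how long the system held the driving task
    wheel_torque_inputs: List[float]  # driver steering input leading up to the event
    monitoring_alerts: List[str]      # e.g. ["hands_off_warning", "eyes_off_road"]
    camera_clip_ids: List[str] = field(default_factory=list)  # stored video segments

# With a record like this, liability becomes largely a question of fact:
# if the system was engaged and inside its ODD, the manufacturer held
# the driving task at the moment of the crash.
record = EventDataRecord(
    timestamp_utc="2023-05-01T17:42:10Z",
    system_engaged=True,
    seconds_since_engagement=312.5,
    wheel_torque_inputs=[0.0, 0.0, 1.8],
    monitoring_alerts=["hands_off_warning"],
)
print(record.system_engaged)  # True
```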
 
Please post a link to the manual or the text thereof. That would never fly in the US.
The US is a big place and self-driving cars are regulated by the states. Mercedes is planning on releasing it here in California where the law allows it.


"For customers, this means the ultimate driving experience. They can relax or work and win back valuable time."

If you want to learn more about it, there is tons of information available on the internet.
 