
Enhanced Autopilot is BACK!!

As @Ofarlig points out, if a company like Mercedes is willing to cover the liability just like a standard new car warranty, then everything is seamless and smooth. There's no need to go to court to prove that the elevator's software caused the accident.

It takes time to go to court to prove there's a design flaw by an automobile manufacturer. It took six years in court before GM settled the ignition-switch class action.

Tesla's approach of not covering liability is not consumer-friendly. I understand that a driver can be blamed right now with Tesla's automation system, but Tesla needs to communicate confidence in its own technology by covering liability when there's no driver in a Robotaxi.
Liability for car accidents is decided in court unless someone admits fault. I'm so confused about this whole issue. If the self-driving system is found to be at fault, then that is by definition a design flaw, since self-driving cars are required to drive in accordance with the law (at least in California).
 
...unless someone admits fault...

That's the point!

If I got a flat tire, Tesla is not going to give me a hard time asking why I caused the flat tire. It would send Roadside Assistance out for free if I'm still under warranty.

If Tesla did not want to send Roadside Assistance, I might have to go to court, sue, and get the result years later! It's a design flaw because with cheaper cars, I could have a spare tire and the included equipment to take care of the issue myself without needing Roadside Assistance.

And that's the same with FSD liability.

If your own driverless Tesla FSD Robotaxi hits a stationary vehicle in front of it some day, Tesla is not currently promising a blanket policy to cover FSD accidents. You'd have to go through the normal route as it stands today: file a claim with your insurance, then go to court to get compensated by Tesla years later.

The problem with going to court is that you need a good lawyer. In the OJ Simpson case, it was the wrong move to have him try on the very small glove that didn't fit, and the prosecution lost the whole case to "If it doesn't fit, you must acquit!"

Thus, we need a stronger, more consumer-friendly law for autonomous vehicles so a Tesla FSD accident doesn't have to go to court.
 
So the court is going to find that the owner of the car, who is not in the vehicle or driving it, is at fault? That makes no sense.
Anyway, I'm sure this will all be clarified by regulations, and if the self-driving system is at fault in an accident then the manufacturer will be liable. I think auto companies trying to lobby for any other system is silly.
 
..That makes no sense...

Europe seems to be developing a more consumer-friendly law that holds manufacturers liable if the autonomous car doesn't function as advertised, such as rear-ending the car in front in clear weather and optimal driving conditions.

In the US, however, the law will most likely protect autonomous vehicle manufacturers if consumers don't make noise right now.

From Tesla's standpoint, it's just like owning an automatic elevator.

Elevators used to be manually operated by an owner's employee called an elevator operator, liftman, or lift attendant.

If something went wrong, that employee might be fired, and the owner could be held liable, with insurance covering the claim first. It was then up to the owner to go to court to get any compensation from the elevator manufacturer.

Now, elevators are no longer manually operated by an employee. They are all automatic. There's no elevator operator to be fired anymore, but there's still an owner with insurance in case there's an elevator accident.

Tesla believes it's the same with cars. Drivers can be held liable for now, but in the future there will be no driver to hold liable, only the owners of those autonomous vehicles.

And it's not just accidents: what about tickets for running red lights, rolling through stop signs, speeding, or a reckless Auto Lane Change? Those tickets will have no driver to sign for them, but there's still a license plate for red-light cameras and others to use to send the ticket to the owner of the car (not the car's manufacturer).

If consumers don't raise any noise, it will be quite bad, as already happened in the Dallas area, where it was almost impossible to hold anyone liable when a driver sought about $2,600 in compensation for damage caused by a tiny autonomous food-delivery vehicle.

NBC 5 Responds: What Happens if You're in a Crash With a Robot?

The driver tried to get the process going with the police, but they doubted whether that autonomous food-delivery vehicle was even a "vehicle" under the current law because of its size.

The city had given permission for the operation but gave her the silent treatment.

The only contact info, painted on the vehicle, is:

starship.-asu.com

but that address is now invalid.

Eventually, reached through its main website, the company said it was not at fault.

Only when the driver got the TV news team involved did the company start to process the claim; it eventually paid the driver the requested amount.

It's the Wild West right now in the US when it comes to autonomous vehicle liability.

Consumers need to get involved and help write the law!
 
Now, elevators are no longer manually operated by an employee. They are all automatic. There's no elevator operator to be fired anymore, but there's still an owner with insurance in case there's an elevator accident.

Tesla believes it's the same with cars. Drivers can be held liable for now, but in the future there will be no driver to hold liable, only the owners of those autonomous vehicles.
I wonder if this is really true. Are there really cases of design defects in elevators where the manufacturer was not liable? Why are elevators different from other products?
Tesla does not make autonomous vehicles yet, only driver assist systems. With regard to flaws in FSD, Elon Musk did say "If it is something endemic to our design, certainly we would take our responsibility for that." That seems to say that Tesla would have liability if the vehicle gets into an at-fault accident.
 
...Tesla does not make autonomous vehicles yet, only driver assist systems...

We are talking about the laws that will govern autonomous vehicle liability, of which we currently have zero, both in law and in machines. So the question is: should we wait until something like what happened in Dallas occurs before we begin to think about it?

...I wonder if this is really true. Are there really cases of design defects in elevators where the manufacturer was not liable? Why are elevators different from other products?...

Manufacturers are liable for what they make, whether it's an elevator or an autonomous vehicle.

The issue here is: how do we hold them liable? Should the law allow manufacturers to fight in court first, for decades, until after we're dead?

Current US law and US proposed law do not automatically hold Autonomous Vehicle manufacturers liable if there's an accident. Victims need to present their case in court in order to prove that there's a design flaw.

Europe's proposed law assumes autonomous vehicle manufacturers are automatically liable if there's an accident where their own specifications say there should not be one. For example, the LIDAR reliably detects obstacles and the software is written to brake for them. If the car didn't brake and rear-ended the car in front, Europe's proposed law holds the manufacturer automatically liable, while current US law is just like in Dallas: it's the victim's problem, and good luck finding a TV reporter to help you out!
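To make the difference concrete, here is a minimal, purely illustrative sketch in Python of the EU-style rule as I understand it. None of this is real manufacturer code or actual legal text; the log fields and function names are hypothetical.

```python
# Purely illustrative sketch of the EU-style idea described above: liability
# attaches automatically when the car fails to do what its own specification
# says it must do. Hypothetical log fields, not any real manufacturer's data.
from dataclasses import dataclass

@dataclass
class CrashRecord:
    obstacle_detected: bool    # hypothetical: LIDAR flagged the car ahead
    brake_commanded: bool      # hypothetical: software issued a brake command
    rear_ended_lead_car: bool  # the accident that actually happened

def manufacturer_automatically_liable(record: CrashRecord) -> bool:
    """EU-style rule: if the spec says 'detect and brake' and the logs show
    detection without braking before a rear-end collision, the manufacturer
    is liable automatically -- no design-flaw lawsuit needed first."""
    spec_violated = record.obstacle_detected and not record.brake_commanded
    return spec_violated and record.rear_ended_lead_car

# Under the US status quo described above, the same logs would only be the
# starting point of a court case the victim has to win.
print(manufacturer_automatically_liable(
    CrashRecord(obstacle_detected=True, brake_commanded=False, rear_ended_lead_car=True)
))  # True under the EU-style rule sketched here
```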

Elon Musk: Tesla not liable for driverless car crashes unless it's design related
 
Current US law and US proposed law do not automatically hold Autonomous Vehicle manufacturers liable if there's an accident. Victims need to present their case in court in order to prove that there's a design flaw.
Why would they be automatically liable? If I run someone over with my car, I'm not automatically liable; the victim (or the state, if it's a criminal trial) has to prove in court that I am at fault and therefore liable. I don't see how it's different if it's done by an autonomous vehicle. An autonomous vehicle, in order to be allowed on the road, at least in California, must obey the vehicle code. If it gets into an at-fault accident then it did not obey the vehicle code, and that is a design flaw.
It's right there in the form you use to register an autonomous vehicle in CA:
[Screenshot of the California autonomous vehicle registration form]

It just seems like the standard of proof would be the same since an at-fault accident is a design flaw. Now I suppose you could argue that the vehicle was designed to comply with the vehicle code but failed and therefore it's not a design flaw. I don't know, I'm not a lawyer. :p

I admit that things get trickier when we're talking about Level 3 vehicles, where there need to be rules about the driver's responsibility.
 
Wow! I was for a moment so happy that I hadn't paid $7000 for FSD when I saw this offer, and tempted to pay the $4k for the features I thought I wanted when I bought my car but didn't think were worth it. But then I thought for a while and realized that even this wasn't worth $4000 any more. I've driven my car for 5000 miles now with basic Autopilot, and while I'm occasionally annoyed that I have to manually change lanes, that certainly isn't worth $4k. My previous car had EAP and I never used Summon, Autopark, or NOA. When I bought my car, hands down I would have sprung for this $4k option (in fact I was very tempted by the then-$7k FSD option, in part because of the threats of rising prices), but now I'm glad I didn't, and am no longer tempted even by EAP at the lower price. Also, it seems to me that if I had sprung for FSD because I liked these EAP features, this would leave a bad taste in my mouth.
 
...Why would they be automatically liable?...

Because they need to buy trust from customers.

When Tesla decided against a spare tire, it wanted to boost customers' confidence by covering the cost of Roadside Assistance and towing for a flat tire. It's automatic; there's no need to go to court and argue to get coverage years down the road.

...I admit that things get trickier when we're talking about Level 3 vehicles, where there need to be rules about the driver's responsibility....

For L3 and above accidents?

Volvo CEO: We will accept all liability when our cars are in autonomous mode.

"In an attempt to address one area of uncertainty, Audi says it will assume liability for any accidents that happen when its automated driving technology is in use."

Audi Goes Full Speed on Automated Driving, While Competitors Wary of Liability, Regulation
 
If Volvo's software is driving my car then of course they're liable. I guess it's good that they're acknowledging it. As far as I'm concerned that's also Tesla's view except Musk clarified that they're only liable for accidents where the system is at fault (which I'm sure is also the position of Volvo and Audi).
For L3 vehicles it's more complicated because you need to determine how much time is reasonable for the driver to take over and you can't have the car turn over control 1 foot from a brick wall and then blame the driver for crashing into it. That's going to require additional regulation which I assume is what they're working on in Europe and Korea.
 
...As far as I'm concerned that's also Tesla's view except Musk clarified that they're only liable for accidents where the system is at fault (which I'm sure is also the position of Volvo and Audi)...

Tesla has a very different approach. Tesla does not automatically take responsibility, so you have to prove that it's "endemic".

So, if there have been only 3 fatal Autopilot accidents so far while the rest of the fleet (us) hasn't had any, is that "endemic"?

There were only 2 fatal Boeing 737 Max crashes; would that be "endemic"?

...For L3 vehicles it's more complicated because you need to determine how much time is reasonable for the driver to take over and you can't have the car turn over control 1 foot from a brick wall and then blame the driver for crashing into it...

Audi Traffic Jam Pilot is designed as L3, and Audi automatically assumes liability when the user lets the system drive and it causes an accident. (That means if the user saw that the system was about to collide and intervened to avoid the accident, it then becomes the user's responsibility.)
 
Tesla has a very different approach. Tesla does not automatically take responsibility, so you have to prove that it's "endemic".

So, if there have been only 3 fatal Autopilot accidents so far while the rest of the fleet (us) hasn't had any, is that "endemic"?

There were only 2 fatal Boeing 737 Max crashes; would that be "endemic"?
Musk was talking about a future autonomous mode, not a driver assist system. With a driver assist system, Tesla does not accept liability because they are not legally the driver of the car. I admit I don't really understand his use of the word "endemic" since all design flaws are "endemic."
The design flaw in the Boeing 737 Max is endemic since it's in every 737 Max built. They will all respond exactly the same way to a failure of a single angle of attack sensor.
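As a toy illustration only (this is not Boeing's actual control logic, and the threshold numbers are made up), this is what "endemic to the design" means: every unit built from the same design reacts identically to the same bad sensor input, so the flaw travels with the design itself.

```python
# Toy illustration of a flaw that is "endemic to the design": every aircraft
# built to this design responds the same way to one failed angle-of-attack
# sensor. Not Boeing's real logic; thresholds are invented for illustration.
AOA_TRIGGER_DEG = 15.0       # hypothetical trigger threshold
SENSOR_DISAGREE_DEG = 5.0    # hypothetical cross-check tolerance

def single_sensor_design(aoa_left_deg: float) -> str:
    # Acts on a single sensor, so one stuck-high reading is enough to
    # command nose-down trim on every unit that shares this design.
    return "nose-down trim" if aoa_left_deg > AOA_TRIGGER_DEG else "no action"

def cross_checked_design(aoa_left_deg: float, aoa_right_deg: float) -> str:
    # Sketch of the obvious mitigation: disagreeing sensors inhibit the
    # automation instead of acting on a single bad reading.
    if abs(aoa_left_deg - aoa_right_deg) > SENSOR_DISAGREE_DEG:
        return "sensor disagree -- automation inhibited"
    return "nose-down trim" if aoa_left_deg > AOA_TRIGGER_DEG else "no action"

print(single_sensor_design(aoa_left_deg=40.0))                     # nose-down trim
print(cross_checked_design(aoa_left_deg=40.0, aoa_right_deg=2.0))  # sensor disagree -- automation inhibited
```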
Audi Traffic Jam Pilot is designed as L3, and Audi automatically assumes liability when the user lets the system drive and it causes an accident. (That means if the user saw that the system was about to collide and intervened to avoid the accident, it then becomes the user's responsibility.)
I think if the car gets in a situation where an accident is unavoidable, the driver takes over manually, and Audi denies liability that would be something that would be litigated. There will always be lawsuits for ambiguous cases.
 
...I think if the car gets in a situation where an accident is unavoidable, the driver takes over manually, and Audi denies liability that would be something that would be litigated.

Audi's position is very simple: If the automation system (L3 and above) is active during an accident then Audi would automatically pick up the tab.

Notice that L3 is not perfect, so it still needs human MANUAL action from time to time.

It's the MANUAL part that Audi does not cover, because the driver, not the automation, is operating at that time.
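In other words, the attribution rule being described boils down to a single question: who was operating the car at the moment of the accident? A minimal sketch of my reading of that rule (not actual Audi code or policy text):

```python
# Minimal sketch of the attribution rule described above -- my reading of
# Audi's stated position, not any real Audi code or policy document.
def liable_party(l3_system_active_at_impact: bool) -> str:
    """If the L3 automation was driving when the accident happened, the
    manufacturer picks up the tab automatically; if the human had taken
    over manually, liability falls back to the driver as it does today."""
    return "manufacturer" if l3_system_active_at_impact else "driver"

print(liable_party(True))   # manufacturer
print(liable_party(False))  # driver
```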

...There will always be lawsuits for ambiguous cases.

That's what the law is for: to reduce lawsuits.

Most people or companies would follow the law but some would prefer to fight with a lawsuit.

For example, most would follow the law to drive in the correct direction, but some might be ticketed for going the wrong direction and still want to challenge that in court.
 
...."endemic"...

Endemic means pervasive: common, in the majority, happening most of the time.

That's why Tesla can argue in court that the deaths are not endemic.

And that's why, before the Boeing 737 Max was grounded in the US, the US was arguing that only 2 crashes were not endemic: too few to panic over. Only after Canada grounded it, not because of the number of cases but because the two crashes showed essentially the same pattern, did the US follow.
 
Audi's position is very simple: If the automation system (L3 and above) is active during an accident then Audi would automatically pick up the tab.

Notice that L3 is not perfect, so it still needs human MANUAL action from time to time.

It's the MANUAL part that Audi does not cover, because the driver, not the automation, is operating at that time.
Of course, but there is a hand-off from the car to the driver. That's the main thing that requires regulation. Pretty sure Audi is not going to pick up the tab for accidents that are not due to a flaw in the system, though (i.e. not their fault).

For Level 4-5 systems the system is in control of the car and no driver is required. That makes liability simple.
 
For L3 vehicles it's more complicated because you need to determine how much time is reasonable for the driver to take over and you can't have the car turn over control 1 foot from a brick wall and then blame the driver for crashing into it. That's going to require additional regulation which I assume is what they're working on in Europe and Korea.

Audi Traffic Jam Pilot is designed as L3, and Audi automatically assumes liability when the user lets the system drive and it causes an accident. (That means if the user saw that the system was about to collide and intervened to avoid the accident, it then becomes the user's responsibility.)

Audi's Traffic Jam Pilot only works on divided highways and at speeds below 37 mph. Those are rather limited conditions. The only times you'd probably be able to use this system would be in stop-and-go traffic on the highway. You don't have cross traffic to worry about. You probably won't have pedestrians to worry about either, since you are on a highway in traffic. You are travelling at low speed. The car is probably stuck between cars in each adjacent lane and cars in front. You really just have to worry about staying in the lane, following the car in front, and cars trying to cut in. And again, you are going pretty slowly, so you have more time to react. And presumably Audi feels confident enough that the system can handle traffic-jam conditions with high reliability to be L3. So I think Audi deliberately constrained the system to conditions with a low risk of accidents, and Audi is more willing to accept liability since they consider the risks to be low.

The L3 system can only be on in those low-speed traffic-jam conditions. Under those conditions, yes, Audi accepts all liability. The only time the driver takes over again is when the car is about to leave those traffic-jam conditions, i.e., the traffic jam is clearing up and the car is accelerating and about to exceed 37 mph. And when the driver does take over, the driver is liable again, since the L3 system is off at that point.
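The point about constraining risk is easy to see if you sketch the operational design domain (ODD) gate as code. This is only an illustration of the two conditions described above, not Audi's implementation:

```python
# Illustration only (not Audi's implementation) of the ODD gate described
# above: the L3 mode is only offered on divided highways below 37 mph,
# which keeps the risk Audi accepts liability for deliberately small.
MAX_L3_SPEED_MPH = 37  # the Traffic Jam Pilot limit mentioned above

def traffic_jam_pilot_available(on_divided_highway: bool, speed_mph: float) -> bool:
    # Both conditions must hold; otherwise the driver is (or stays) in
    # control and liability stays with the driver.
    return on_divided_highway and speed_mph <= MAX_L3_SPEED_MPH

print(traffic_jam_pilot_available(True, 12.0))  # True: L3 on, Audi accepts liability
print(traffic_jam_pilot_available(True, 45.0))  # False: traffic clearing, hand-off back to the driver
```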
 
Endemic means pervasive: common, in the majority, happening most of the time.

That's why Tesla can argue in court that the deaths are not endemic.

And that's why, before the Boeing 737 Max was grounded in the US, the US was arguing that only 2 crashes were not endemic: too few to panic over. Only after Canada grounded it, not because of the number of cases but because the two crashes showed essentially the same pattern, did the US follow.
Musk said "endemic to our design" though not that the occurrence of accidents caused by the design are endemic.
Now it appears from a Google search that Musk is the only person to ever use those four words in that order so I guess it's open to interpretation. haha. It really doesn't make much sense.
 
..."endemic to our design" though not that the occurrence of accidents caused by the design are endemic...

The whole sentence:

"If it is something endemic to our design, certainly we would take our responsibility for that."

That "something" can be deaths, injuries, accidents, collisions...

For example: If it is the fatality rate endemic to our design, certainly we would take our responsibility for that.
 
My interpretation is that "endemic to" just means "part of", with a bad connotation. So I interpret it as "If it is something that is part of our design, certainly we would take responsibility for that."
He uses it in the context of elevators. OTIS is not liable for all accidents that occur on elevators, only those caused by design flaws. For example, if the elevator occasionally free-falls from the top floor to the bottom floor because of a software flaw, OTIS is liable. If it free-falls from the top floor to the bottom floor because the maintenance guy didn't tighten a bolt, they are not.
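Put another way, the OTIS analogy is really a classification of the accident's cause, something like this sketch (purely illustrative; the categories are mine, not OTIS's or Tesla's):

```python
# Purely illustrative sketch of the distinction drawn above: the manufacturer
# answers for causes baked into the design, not for causes introduced later.
def liable_for_elevator_accident(cause: str) -> str:
    design_causes = {"software flaw", "design flaw"}             # endemic to the design
    field_causes = {"bad maintenance", "loose bolt", "misuse"}   # introduced after delivery
    if cause in design_causes:
        return "manufacturer"
    if cause in field_causes:
        return "owner / maintenance contractor"
    return "to be determined in court"

print(liable_for_elevator_accident("software flaw"))  # manufacturer
print(liable_for_elevator_accident("loose bolt"))     # owner / maintenance contractor
```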
 
...interpretation...

The bottom line is: Tesla does not automatically assume the liability.

Europe's proposed law wants to make manufacturer liability automatic while the car is driven by the autonomous system and not by a human.

That's what Volvo's promise is too, even in L3. If an accident is caused by the Traffic Jam Pilot without the human driver's manual interference, then the manufacturer automatically takes responsibility.