
What will Tesla do when FSD has a serious accident?


Matias

Active Member
Apr 2, 2014
Finland
A serious accident with FSD will happen when FSD makes a catastrophic mistake and the driver does not correct it fast enough. For a serious accident with FSD never to happen, either FSD would have to never make catastrophic mistakes or FSD users would have to be perfect. We know that neither of those presumptions is true.

FSD will now be distributed to 100k cars. A serious accident will happen sooner or later.

What will Tesla do when that happens?
 

Tesla will do exactly what they have done with AP accidents. They will blame the driver, and the driver will be responsible for the accident. That's because Tesla will say that FSD is L2 and therefore the driver was responsible for driving. Tesla will point to statements that the system is not fully autonomous and that the driver must supervise at all times. It won't matter that FSD can do entire trips with zero interventions; the driver is still responsible.
 
There is a standard in the law that even if you are in the right, if you had the opportunity to avoid an accident and did not exercise it, you are at fault to varying degrees. Therefore, if a licensed driver was in the self-driving car at the time of the crash, then that driver could be considered negligent for not preventing the crash.

But as @diplomat33 noted, FSD is a level 2 system and by definition you are required to monitor and take over when needed.
 
The real question is... who will Mercedes blame?

Mercedes' system is L3 so Mercedes is liable. Of course, they could try to blame someone else but they would probably lose.


I think it would all depend on the SAE level of the system. That legal standard, being at fault if you had a reasonable opportunity to avoid an accident and did not take it, definitely applies to automated driving systems that are L2 or less, since the driver is still responsible for all of the driving even when the car is "self-driving". But I don't believe it applies to automated driving systems that are L4 and above; L3 is a bit of a grey area IMO. For L4 and above, the vehicle is responsible for all driving tasks and would therefore be expected to avoid accidents, within reason, without any human intervention. So I don't think the driver would be expected to intervene for L4 and above.

Of course, we don't have a lot of experience with autonomous driving accidents yet, so there is not really any legal precedent AFAIK. It is conceivable that a court could rule that even though the L4 car was responsible for avoiding the accident, the human still had a duty to intervene if they had a reasonable opportunity to prevent it. But that would undermine the whole concept of autonomous driving. The whole idea of fully autonomous driving is that the human is only a passenger with no responsibility for any driving tasks. It should also be noted that there could be nobody in the driver's seat at all in an L4 vehicle, like we see with Waymo's driverless rides. Certainly, if there was no human in the driver's seat at the time of a crash with an L4 vehicle, I don't think you could expect the human, who was in the back seat, to intervene. The burden is clearly placed on the L4 vehicle to prevent crashes if possible.

That is why the SAE levels matter and why defining "self-driving" matters. There is a big difference between a car that is "partial self-driving", like L2 or L3, where the human would still be expected to intervene and a car that is true "full self-driving", like L4 and above, where the human is a passenger with no driving responsibilities at all.
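
To make that distinction concrete, here is a rough sketch in Python of the responsibility mapping I'm arguing for. The level descriptions and the who-is-responsible mapping are just my reading of SAE J3016 and the reasoning in this thread, not settled law or any manufacturer's official position:

```python
# Illustrative only: who this thread's reasoning would expect to handle the
# dynamic driving task (and absorb the blame) at each SAE level. Not legal advice.
RESPONSIBILITY_BY_SAE_LEVEL = {
    0: "human driver",          # no automation
    1: "human driver",          # driver assistance
    2: "human driver",          # partial automation: driver must supervise at all times
    3: "grey area",             # conditional automation: fallback-ready user must take over on request
    4: "vehicle/manufacturer",  # high automation within its operational design domain
    5: "vehicle/manufacturer",  # full automation, no human driver needed at all
}

def human_expected_to_intervene(sae_level: int) -> bool:
    """Per the argument above: is the person in the driver's seat still on the hook?"""
    return RESPONSIBILITY_BY_SAE_LEVEL.get(sae_level) == "human driver"

for level in sorted(RESPONSIBILITY_BY_SAE_LEVEL):
    print(f"L{level}: {RESPONSIBILITY_BY_SAE_LEVEL[level]} "
          f"(human expected to intervene: {human_expected_to_intervene(level)})")
```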
 
I think a lot might depend on who owns the vehicle... if you hail a robotaxi, then the taxi company could be responsible... but with L5 in your own car (unless it's so far in the future that it doesn't have controls) you might be liable.
 

Like I mentioned above, the definition of L5 implies that you are just a passenger and the vehicle is responsible for all the driving. So I don't think it would make sense for the driver to still be liable while the L5 system was driving. The only way the owner could be liable is if they tampered with the L5 system in some way or turned it off right before the accident. And yes, some L4 and L5 vehicles will not have any controls at all.
 
Nothing. The human driver is ALWAYS, I repeat ALWAYS, responsible for the vehicle.
 
@diplomat33 Laws will have to be updated as well.
Autonomous vehicles are already legal in your own state and many other jurisdictions.
  • (a) A person may operate a fully autonomous vehicle with the automated driving system engaged without a human driver being present in the vehicle, provided that such vehicle:
    • (1) Unless an exemption has been granted under applicable federal or state law, is capable of being operated in compliance with Chapter 6 of this title and this chapter and has been, at the time of its manufacture, certified by the manufacturer as being in compliance with applicable federal motor vehicle safety standards;
    • (2) Has the capability to meet the requirements of Code Section 40-6-279;
    • (3) Can achieve a minimal risk condition in the event of a failure of the automated driving system that renders that system unable to perform the entire dynamic driving task relevant to its intended operational design domain;
    • (4) (A) Until December 31, 2019, is covered by motor vehicle liability coverage equivalent to 250 percent of that which is required under:
      • (i) Indemnity and liability insurance equivalent to the limits specified in Code Section 40-1-166; or
      • (ii) Self-insurance pursuant to Code Section 33-34-5.1 equivalent to, at a minimum, the limits specified in Code Section 40-1-166; and
      • (B) On and after January 1, 2020, is covered by motor vehicle liability coverage equivalent to, at a minimum:
        • (i) Indemnity and liability insurance equivalent to the limits specified in Code Section 40-1-166; or
        • (ii) Self-insurance pursuant to Code Section 33-34-5.1 equivalent to, at a minimum, the limits specified in Code Section 40-1-166; and
    • (5) Is registered in accordance with Code Section 40-2-20 and identified on such registration as a fully autonomous vehicle or lawfully registered outside of this state.
 

They will keep trying to make FSD better.