
Mercedes approved for ACTUAL self driving in the USA. And will accept responsibility.

Are you willing to send your kids off in an L4 vehicle whose functionality no one is responsible for?
The general public is not as conscious and awake as you are. These pictures of kids going to school will tell you the story.

[Attached images: IMG_7072.jpeg, IMG_7073.jpeg]
 
We need a universal approach that everyone can work together on so this gets done in the next 5 years instead of 20. Almost every maker has shown demos or prototypes with some impressive capabilities. Audi, for example, has some serious hardware on their current A7 and A8 cars, not to mention even more impressive lab prototypes shown over the last 5-10 years.

Everyone is just lagging on development, implementation and sales.
No one wants to collaborate. They would have to split the profits.
If non-profit companies had these resources, maybe. But even those don't want to share the credit.

If our society actually cared about the overall advancement of EVERYONE, we'd already have the cure for cancer, cold-fusion, and vacation resorts on Mars.

I'm a hyper-capitalist, but greed is what holds back humanity.

At the same time, in the system we have now, it's competition that pushes innovation at all. If we allowed monopolies, the spark for advancement would just die. So it is good that there are multiple companies in the autonomy race; they push each other.
 
We were discussing autonomous systems and manufacturer liability. Some of you would apparently get on a plane, regardless of whether it's 1000x more likely to crash than a competitor's plane, because you have insurance?


Your thinking is totally going in the wrong direction.

People will not fly an airline if they are 1000x more likely to fail, regardless of what insurance it provides. Conversely, they will fly the safest airline even if no insurance is provided.
 
100% agree. You're making my point. People will not buy or ride in an autonomous car that doesn't come with manufacturer/operator guarantees, in the form of, for example, transparent statistics and submission to formal third-party tests. Regardless of insurance. Companies that launch shitty autonomous (e.g. eyes-off) systems cannot reinsure and will go bankrupt faster than one can say "FSD".
 
So if Tesla FSD is proven safer and Tesla does not provide any type of insurance, do you believe people will reject Tesla?
 
You’re confused. You do not need insurance as a taxi passenger. You need insurance as a taxi driver. Tesla is driving and operating the taxi, and if it malfunctions, Tesla will be sued into oblivion. No one will reinsure Tesla if they have a faulty product.

This is all hypothetical, of course, as computer vision alone will likely never suffice for autonomy.

I’m saying that AV system providers will need to deliver a safe product or they will be sued.

The riders of an AV do not need insurance for driving, since they are never driving. They are riding.
 
It seems you are confused.

Tesla will sell you a vehicle with FSD. If you want to operate that as a robotaxi, you will need to insure the vehicle and passengers like any commercial taxi. Tesla is not going to assume liability and provide insurance.
 
In the real world where I live, personally owned vehicles will not become robotaxis this decade.

If you sell an L3+ product and advertise it as L3+, you will be legally liable when the system operates in that mode, whether you say so or not. This discussion was about manufacturer guarantees, not insurance.

Regardless, the provider of the system will end up bearing the cost of accidents caused by the system in that mode.
 
So all the Yellow Cab Crown Victoria taxis are insured by Ford?

Of course, if your local regulations require it, you will need to establish a commercial entity and assign your Tesla as a robotaxi under that entity, and if FSD is proven safe and reliable, the insurance company will give you a special discount.
 
You keep talking about insurance. I am not talking about insurance. I am talking about manufacturer liability, and the fact that if you claim your system performs the full OEDR, then you're implicitly liable for the driving.


Is Ford driving those vehicles?
and you keep missing the point that YOU, the owner, have the choice of using FSD or not. Tesla is not driving it. Tesla has trained your FSD computer. The FSD computer installed in your Tesla, owned by you, is driving it, at your express wish to use it.
 
My point of view here is that any product that is SAE J3016 L3+ comes with certain guarantees, either explicit or implicit (civil suits). If you understand J3016, you understand what the DDT, the ODD, and the OEDR are, and then you would also understand that the system is 100% driving in these modes and that the human in the car is not.
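For anyone who hasn't read the standard, the J3016 split described above can be sketched in a few lines of code. This is an informal paraphrase (the level names and role assignments follow J3016's definitions, but the data structure and function names are purely illustrative, not anything official):

```python
# Informal summary of SAE J3016: who performs the complete dynamic driving
# task (DDT, which includes the OEDR) and who is the fallback at each level.
# At L0-L2 the human driver does the OEDR even when the system controls
# steering/speed; at L3/L4 the "system" entries apply only inside the
# feature's operational design domain (ODD).
SAE_LEVELS = {
    0: {"name": "No Automation",          "ddt": "driver", "fallback": "driver",              "odd_limited": True},
    1: {"name": "Driver Assistance",      "ddt": "driver", "fallback": "driver",              "odd_limited": True},
    2: {"name": "Partial Automation",     "ddt": "driver", "fallback": "driver",              "odd_limited": True},
    3: {"name": "Conditional Automation", "ddt": "system", "fallback": "fallback-ready user", "odd_limited": True},
    4: {"name": "High Automation",        "ddt": "system", "fallback": "system",              "odd_limited": True},
    5: {"name": "Full Automation",        "ddt": "system", "fallback": "system",              "odd_limited": False},
}

def who_is_driving(level: int) -> str:
    """Who performs the complete DDT when the feature is engaged."""
    return SAE_LEVELS[level]["ddt"]

print(who_is_driving(2))  # driver (L2: the human supervises and does the OEDR)
print(who_is_driving(3))  # system (L3: the system drives inside its ODD)
```

The key line for this argument is the L2/L3 boundary: the "ddt" owner flips from driver to system, which is exactly why L3+ is where the liability question changes.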
 
I think the point is that's irrelevant to legal liability, though. There are plenty of cases where people have sued Tesla over AP, even though as L2 it is by definition never responsible for the driving. In the US, it's common to sue any tangentially related party. Even if your car is in L3 mode, you are almost guaranteed to still be sued if something happens (especially if you are financially well off, as a top-end Mercedes owner might be). So as a differentiator of L2 vs L3+, the vehicle manufacturer's legal liability does not necessarily play a huge role.

Yes, in the early examples coming out, you are seeing manufacturers like Mercedes indemnify or insure the owner while the car is operating in L3 mode.

But for things like robotaxis, generally the owner/operator is the primary legal party responsible. For Waymo, for example, Waymo is generally responsible, not Jaguar, FCA, or Magna, even though they manufactured the vehicles. If self-driving systems get commoditized, with many off-the-shelf systems available, I can totally see a legal framework where owner/operators take primary responsibility when things go wrong.
 
I mean sure, it's a complex issue, but ultimately you can't be convicted of manslaughter if you are a rider and not a driver, which you are in any autonomous mode (L3, L4, L5).
In a robotaxi situation, if the entity producing the autonomous system is different from the one operating the local mobility service, I am sure there will be legalese for sorting this out between the parties.

Imagine a personal robot helping in the home, bought from an off-the-shelf robot vendor with a service and software update plan, and it suddenly unlocked the safe, grabbed a gun, and killed everyone in sight. I don't think it's reasonable to charge the robot owner with murder. That would be a manufacturer legal issue imho. :)
 
No one is talking about the rider being liable, but about the owner or the driver (if there is a take-over driver, as in L3).
The robot owner might not be charged with murder (no party necessarily will be), but it's easy to see the owner taking at least shared legal responsibility for its actions. Here's a paper that discusses this, noting that owners of dogs that attack people may bear legal responsibility for the dog's actions (not the original trainer or breeder).


Basically, there will be cases where you can't assume manufacturer liability. For example, what if you changed the story slightly:
you left the gun somewhere it could easily be taken (the same negligence liability as when a child shoots someone), or you handed the gun to the robot, or directed the robot to shoot someone, or tricked it into doing so (it mistakes the gun for a non-lethal tool, for example a spray bottle used for cleaning). As such, the owner really carries a lot of liability; you can't just assume the manufacturer is the default liable party.
 
If the system is in autonomous mode at L3, the system is driving and the human is not. The human only drives at the system's request (typically when it leaves the ODD) and after a completed handover procedure. There is no human driver in autonomous mode; the system is the driver.

I don't think the dog case is relevant to robotics and SDCs at all. Different jurisdictions will handle this differently, for sure. If the rider or owner is negligent, by pulling the wheel or not servicing the car, that's another story. Let's assume that isn't the case.

I see self-driving cars as closer to the domain of malfunctioning products that hurt people even though they are operated as designed.