The general mass of the public is not as conscious and awake as you are. These pictures of kids going to school tell the story. Are you willing to send your kids in an L4 vehicle whose functionality no one is responsible for?
His point is that taking over liability is not a requirement for the definition. It's not even required for L3: a manufacturer can declare its car L3 and still require the driver to retain liability for the vehicle. Taking on liability is entirely optional for the manufacturer.
So you get a payout when the machine, which is several orders of magnitude less safe than a human, happens to kill your kids, and that is the status quo? Makes sense.

That is what your insurance is for. If you don't want it, don't buy or use it. Not hard. See?
No one wants to collaborate; they would have to split the profits.

We need a universal approach that everyone can work on together to get this done in the next 5 years instead of 20. Almost every maker has shown demos or prototypes with some impressive capabilities. Audi, for example, has some serious hardcore parts on their current 7- and 8-series cars, not to mention even more impressive lab prototypes shown over the last 5-10 years.
Everyone is just lagging on development, implementation and sales.
Then don’t use FSD. Is it that hard? Thank god for insurance.
We were discussing autonomous systems and manufacturer liability. Some of you would apparently get on a plane even if it's 1000x more likely to crash than a competitor's plane, because you have insurance?
100% agree. You're making my point. People will not buy or ride in an autonomous car that doesn't come with manufacturer/operator guarantees, for example in the form of transparent statistics and submission to formal third-party tests, regardless of insurance. Companies that launch shitty autonomous (e.g. eyes-off) systems cannot get reinsurance and will go bankrupt faster than one can say "FSD".

Your thinking is going in totally the wrong direction.
People will not fly an airline if it is 1000x more likely to crash, regardless of what insurance it provides. Conversely, they will fly the safest airline even if no insurance is provided.
So if Tesla FSD is proven safer and Tesla does not provide any type of insurance, do you believe people will reject Tesla?
You’re confused. You do not need insurance as a taxi passenger; you need insurance as a taxi driver. Tesla is driving and operating the taxi, and if it's not functioning, Tesla will be sued into oblivion. No one will reinsure Tesla if it has a faulty product.
It seems you are confused.
This is all hypothetical, of course, as computer vision alone will likely never suffice for autonomy.
I’m saying that AV system providers will need to deliver a safe product or they will be sued.
The riders of an AV do not need insurance for driving, since they are never driving. They are riding.
In the real world where I live, personally owned vehicles will not become robotaxis this decade.
Tesla will sell you a vehicle with FSD. If you want to operate that as a robotaxi, you will need to insure the vehicle and passengers like any commercial taxi. Tesla is not going to assume liability and provide insurance.
So all the Yellow Cab Crown Victoria Taxis are insured by Ford?
If you sell an L3+ product and advertise it as L3+, you will be legally liable when the system operates in that mode, whether you say so or not. This discussion was about manufacturer guarantees, not insurance.
Regardless, the provider of the system will end up bearing the cost of accidents caused by the system in that mode.
Are Ford driving those vehicles?
And you keep missing the point that YOU, the owner, have the choice of using FSD or not. Tesla is not driving it. Tesla has trained your FSD computer. The FSD computer installed in your Tesla, owned by you, is driving it, upon your express wish to use it.

You keep talking about insurance. I am not talking about insurance. I am talking about manufacturer liability and the fact that if you claim that your system does the full OEDR, then you are implicitly liable for the driving.
My point of view here is that any product that is SAE J3016 L3+ comes with certain guarantees, either explicit or implicit (civil suits). If you understand J3016, you understand what the DDT, the ODD and the OEDR are, and then you would also understand that the system is 100% driving in these modes and that the human in the car is not.
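The J3016 split appealed to above can be sketched in code. This is a purely illustrative sketch, not from the standard itself (J3016 is a taxonomy document, not an API, and the field names here are my own): at L3+ the system performs the complete DDT, including the OEDR, so while engaged the system, not the human, is the driver.

```python
# Illustrative model of the SAE J3016 levels. Field names are my own
# shorthand for the taxonomy's concepts, not official terminology.
from dataclasses import dataclass

@dataclass(frozen=True)
class SaeLevel:
    level: int
    name: str
    system_controls_motion: bool  # sustained lateral + longitudinal control
    system_does_oedr: bool        # object-and-event detection and response
    fallback_is_human: bool       # who must respond when the system exits its ODD

LEVELS = [
    SaeLevel(0, "No Driving Automation",          False, False, True),
    SaeLevel(1, "Driver Assistance",              False, False, True),
    SaeLevel(2, "Partial Driving Automation",     True,  False, True),
    SaeLevel(3, "Conditional Driving Automation", True,  True,  True),
    SaeLevel(4, "High Driving Automation",        True,  True,  False),
    SaeLevel(5, "Full Driving Automation",        True,  True,  False),
]

def driver_while_engaged(l: SaeLevel) -> str:
    """The full DDT includes the OEDR. Only when the system performs both
    sustained motion control and the OEDR (L3+) is the system the driver."""
    return "system" if l.system_controls_motion and l.system_does_oedr else "human"

for l in LEVELS:
    print(f"L{l.level} {l.name}: driver while engaged = {driver_while_engaged(l)}")
```

Note the L2/L3 boundary this makes visible: at L2 the system steers and brakes but the human still performs the OEDR, so the human is driving; at L3 the system performs the whole DDT.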
I think the point is that that's irrelevant to legal liability, though. There are plenty of cases where people have sued Tesla over AP, even though as L2 it is by definition never responsible for the driving. In the US, it's common to sue any tangentially related party. Even if your car is in L3 mode, you are almost guaranteed to still be sued if something happens (especially if you are financially well off, as a top-end Mercedes owner might be). So as a differentiator of L2 vs L3+, the vehicle manufacturer's legal liability does not necessarily play a huge role.
I mean, sure, it's a complex issue, but ultimately you can't be convicted of manslaughter if you are a rider and not a driver, which you are regardless of autonomous mode (L3, L4, L5).
In a robotaxi situation, if the entity producing the autonomous system is different from the one operating the local mobility service, I am sure there will be legalese for sorting this out between the parties.

But for things like robotaxis, generally the owner/operator is the primary legally responsible party. For example, for Waymo, generally they are responsible, not Jaguar, FCA, or Magna, even though those are the ones that manufactured the vehicles. If self-driving systems get commoditized, with many off-the-shelf systems available, I can totally see a legal framework where owner/operators are the ones that take primary responsibility when things go wrong.
No one is talking about the rider being liable, but about the owner or the driver (if there is a takeover driver, as in L3).
The robot owner might not be charged with murder (no party necessarily will be), but it's easy to see the owner taking at least shared legal responsibility for its actions. Here's a paper that discusses this, noting that owners of dogs that attack people may take legal responsibility for the dog's actions (not its original trainer or breeder).
Imagine a personal robot helping in the home, bought off the shelf from a robot vendor with a service and software-update plan, that suddenly unlocked the safe, grabbed a gun, and killed everyone in sight. I don't think it's reasonable to charge the robot owner with murder. That would be a manufacturer legal issue, imho.
If the system is in autonomous mode in L3, the system is driving and the human is not. The human only takes over driving at the system's request (typically when it leaves the ODD) and after a completed handover procedure. There is no human driver in autonomous mode. The system is the driver.
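The L3 handover flow described above can be sketched as a tiny state machine. This is a hypothetical illustration (class and method names are invented, not from J3016): the system remains the driver until a takeover request has been issued and the handover has actually completed.

```python
# Hypothetical sketch of an L3 takeover flow. The key property: the
# driver role switches to the human only on a completed handover,
# never implicitly when the takeover request is issued.
from enum import Enum, auto

class Driver(Enum):
    SYSTEM = auto()
    HUMAN = auto()

class L3Vehicle:
    def __init__(self):
        self.driver = Driver.SYSTEM   # engaged: the system performs the DDT
        self.takeover_requested = False

    def leave_odd(self):
        """Leaving the operational design domain triggers a takeover request;
        the system keeps driving until the handover actually completes."""
        self.takeover_requested = True

    def human_takes_over(self):
        """The handover completes only by explicit human action; until then
        the system remains the driver (and remains responsible for the DDT)."""
        self.driver = Driver.HUMAN
        self.takeover_requested = False

car = L3Vehicle()
car.leave_odd()
print(car.driver)          # still the system: request issued, handover pending
car.human_takes_over()
print(car.driver)          # now the human is the driver
```

The design choice worth noting is that `leave_odd` does not touch `driver`: whoever is liable for the driving at any instant is determined by the completed-handover state, not by whether a request is pending.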