Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.
IMO, the AP team is more realistic than Elon, who continued (continues?) to believe robotaxi is around the corner. He even talked about it as the next vehicle on the earnings call, which got a very cold reception (and the stock price dropped).

If Elon keeps saying FSD will be ready by the end of the year, then one of these years it might come true!

My personal belief is that FSD will get very good by 2025 (1 disengagement in 100 miles like NOA) and possibly autonomous (1 disengagement in 10k miles) by the end of the decade.

It's also relevant what kinds of roads it can operate on. Robotaxi would, presumably, have to be able to drive on any city street. From the videos I've watched here, FSD is decades away from being able to drive safely on South Kihei Road. This is one of only two north-south roads that run all the way through Kihei, the other being Pi'ilani Highway. South Kihei Road is one lane in each direction, and large segments of the north half of the road have no shoulders, and the road is shared with cyclists and pedestrians. You have to be able to see a cyclist coming towards you on the other side of the road and know that the car approaching the cyclist from behind is going to have to move close to or over the center line to avoid hitting the cyclist. Is FSD capable of moving to the side of its lane to give room for vehicles approaching from the opposite direction?

The videos I've seen are on ideal city streets. Many city streets are far from ideal. It's great if FSD can operate on ideal streets with few disengagements. But to be worthy of its name it should be able to operate anywhere a human could. Or at least on any paved street that a human could drive on.
 
Not talking about hardware at all - my prediction is just on the software front. I think the gap in hardware is much smaller than the gap in software. Just look at all the other companies (like Waymo and Cruise) - with $100k worth of hardware and every sensor they could want, they all struggle to expand beyond their geographic boundaries. If you watch the YouTube videos comparing Tesla FSD to Waymo and Cruise on the same routes, there isn't a big difference. So I'm not convinced that much better hardware gets anyone much closer to autonomy.
Different/additional hardware will be required to mitigate risk in a generalized robotaxi sufficiently to allow a corporation to assume liability for the DDT (dynamic driving task). There is no way HW3 will be capable of mitigating enough risk, especially in certain situations like inclement weather or winter, unless those situations are geofenced out. I have no doubt that Tesla could already have something rolling in Arizona and the other specific regions and climates where robotaxis currently operate.

New and better GPS antennas will likely resolve some of the issues we've seen with mapping and routing, and are surely necessary for generalized robotaxis.

Processing power and better cameras will likely reduce latency and improve response time to further reduce risk.

If we look at something like ARK's 2026 forecast, I bet they figure HW3 is what will enable human-driven ride-hail revenue alongside future robotaxi revenue. Elon throws out numbers like HW3 being 200-300% safer than a human driver; that isn't nearly enough risk mitigation. When this actually happens, any accident resulting in injuries/deaths where the DDT is owned by a corporation will potentially mean millions of dollars in legal fees and damages.
 
Elon throws out numbers like HW3 being 200-300% safer than a human driver; that isn't nearly enough risk mitigation. When this actually happens, any accident resulting in injuries/deaths where the DDT is owned by a corporation will potentially mean millions of dollars in legal fees and damages.
The robotaxi would have to be at fault, though. Achieving 200-300% safer than average is extremely difficult even with zero at-fault collisions, since other drivers will still hit you.
 
The robotaxi would have to be at fault, though. Achieving 200-300% safer than average is extremely difficult even with zero at-fault collisions, since other drivers will still hit you.
Yeah, I'm still not sure what 200-300% safer is supposed to mean exactly; right now I just take it as driving 200-300% more miles between accidents compared to the rough numbers floating around.

There's a lot of interesting liability-related stuff out there discussing all kinds of different wrinkles that could emerge
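For what it's worth, the ambiguity matters numerically. A quick sketch of the two readings, assuming a rough human baseline of 10,000 miles per accident (my assumed figure, not an official one):

```python
# Two plausible readings of "200% safer", starting from an assumed
# human baseline of 10,000 miles per accident (illustrative only).

baseline_miles = 10_000  # assumed human miles per accident

# Reading 1: "200% safer" = 2x the baseline miles between accidents.
as_multiplier = baseline_miles * 2.0

# Reading 2: "200% safer" = 200% MORE miles between accidents, i.e. 3x.
as_increase = baseline_miles * (1 + 200 / 100)

print(as_multiplier)  # 20000.0
print(as_increase)    # 30000.0
```

Depending on which reading you pick, the same claim is off by a full 10,000 miles per accident, which is why the stat is so hard to pin down.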
 
Yeah, I'm still not sure what 200-300% safer is supposed to mean exactly; right now I just take it as driving 200-300% more miles between accidents compared to the rough numbers floating around.

There's a lot of interesting liability-related stuff out there discussing all kinds of different wrinkles that could emerge

Maybe it means that the FSDb driving fleet is so scared out of their wits that they drive 200-300% safer than the average driver, and/or that the fleet disables FSDb in all known challenging scenarios like being in traffic, approaching intersections, turns, merging lanes, etc. Honestly that stat is BS, since there's no way for FSDb to self-drive.

As Seinfeld might say the most important part of Full Self Driving is doing the Full Self Driving. :)
 
When this actually happens, any accident resulting in injuries/deaths where the DDT is owned by a corporation will potentially be millions of dollars in legal fees and damages.
This is the reason I think any hopes of actually seeing L3 in mainstream consumer cars anytime soon is very limited. The risks are just too great.

2x to 3x safer than humans would just mean one-half to one-third as many at-fault accidents. So, if we take the Virginia Tech estimate of 10k miles per simple accident (like curbing a wheel), an AV should go 20k to 30k miles per accident, or about 5 accidents in the life of a car.

If a car has a $10K margin, that would mean that over the life of the car, the margin will likely be wiped out by accident costs. That is one of the reasons why FSD needs to cost a LOT more for it to actually be affordable for the company.
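The arithmetic above, as a back-of-envelope sketch (the 10k-mile human rate and the 2x-3x multiplier come from the post; the 150k-mile vehicle lifetime is my assumed number, not from the post):

```python
# Back-of-envelope for the lifetime-accident argument above.
# Assumption: a 150,000-mile vehicle lifetime (illustrative).

human_miles_per_accident = 10_000   # Virginia Tech estimate (simple accidents)
safety_multiplier = 3               # "2x to 3x safer", taking the upper end
vehicle_lifetime_miles = 150_000    # assumed vehicle lifetime

av_miles_per_accident = human_miles_per_accident * safety_multiplier
lifetime_accidents = vehicle_lifetime_miles / av_miles_per_accident

print(av_miles_per_accident)  # 30000
print(lifetime_accidents)     # 5.0
```

Even at 5 accidents per vehicle, average claim costs of only a couple thousand dollars each would eat most of an assumed $10K per-car margin, which is the point about FSD pricing.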
 
This is the reason I think any hopes of actually seeing L3 in mainstream consumer cars anytime soon is very limited. The risks are just too great.

2x to 3x safer than humans would just mean one-half to one-third as many at-fault accidents. So, if we take the Virginia Tech estimate of 10k miles per simple accident (like curbing a wheel), an AV should go 20k to 30k miles per accident, or about 5 accidents in the life of a car.

If a car has a $10K margin, that would mean that over the life of the car, the margin will likely be wiped out by accident costs. That is one of the reasons why FSD needs to cost a LOT more for it to actually be affordable for the company.

Or it has to be so good that it has far fewer accidents.

In any case, when there's an accident, the insurance company is the one that pays. Not the driver or the car maker. Self-driving cars will be no different. Either the car maker insures the cars, and charges the buyer to compensate (the car costs more but the owner doesn't have to pay for insurance) or the sale contract requires the buyer to pay for insurance, and the car maker doesn't need to charge extra for the accident risk. And if FSD is really safer than a human driver, then the cost of insuring the car will be less.

I suspect that it will become convention, and perhaps law, that the owner is responsible for insuring the car, even if they are not driving. Note that, at present, if somebody else drives your car with your permission, you are still responsible to have the car insured. For insurance and liability purposes, FSD will be like allowing another person to drive your car.
 
In any case, when there's an accident, the insurance company is the one that pays.
No - states now have laws that make the car maker responsible.

In any case, expect huge lawsuits and payouts. So, either the car maker or their insurer has to work out the financials.

BTW, Tesla might require FSD customers to have Tesla insurance - and say the customer won't have any deductibles when using FSD.
 
In any case, when there's an accident, the insurance company is the one that pays. Not the driver or the car maker.
This is false, at least in California. The insurance company only pays up to your coverage limit. If someone sues you for $10 million and wins but your insurance limit is $1 million then you are on the hook for $9 million.
If you’re wealthy buy umbrella insurance!
 
No - states now have laws that make the car maker responsible.

In any case, expect huge lawsuits and payouts. So, either the car maker or their insurer has to work out the financials.

BTW, Tesla might require FSD customers to have Tesla insurance - and say the customer won't have any deductibles when using FSD.

In that case, the car maker buys insurance and adds it to the price of the car. But with the car maker's insurance covering accidents, the car owners won't need insurance. So they pay more for the car and make it up on insurance. As for $10,000,000 accident claims, the insurance companies will fight them with their teams of lawyers.

It all comes down to how good self-driving technology is. If self-driving cars get so good that they demonstrably save lives, there will be a big push to adopt them, including lobbying from insurance companies for a legal structure that does not punish the car company for making the roads safer!

Of course, we are decades away from having fully-self-driving cars that are that good. I no longer expect to live long enough to see that, though I expect that some here will. If we can hold off a climate catastrophe long enough.
 
This is false, at least in California. The insurance company only pays up to your coverage limit. If someone sues you for $10 million and wins but your insurance limit is $1 million then you are on the hook for $9 million.
If you’re wealthy buy umbrella insurance!

And if the car company buys insurance, and someone sues for $10,000,000, the plaintiff will lose.

But the advice to buy umbrella insurance if you're rich is good. The car companies will buy insurance before selling a car that could expose them to lawsuits.
 
And if the car company buys insurance, and someone sues for $10,000,000, the plaintiff will lose.

But the advice to buy umbrella insurance if you're rich is good. The car companies will buy insurance before selling a car that could expose them to lawsuits.
Insurance has nothing to do with whether or not a lawsuit is successful.
What would make insurance more expensive for AVs is that the coverage limit would have to be very large, and juries might be more likely to find machines liable and grant larger settlements. Also, I imagine that the lawsuits themselves would be more expensive and complicated.
 
As an EU insurance attorney I find some of the above postings cringy.

It's important to separate purchasing a Tesla vehicle (= transfer of ownership from Tesla Inc. to the customer) and the insurance covering said vehicle.

An insurance contract for motor vehicles generally has two elements:
1) the liability is insured, meaning if the vehicle causes damage to others (NOT the insured vehicle or its driver), the insurance policy covers that damage if the person driving the insured vehicle is at fault. (In the case of vulnerable road users, many countries don't require any fault from the driver; the insurers of the involved motor vehicles are automatically/objectively required to pay. The use of a motor vehicle is considered an objective risk to all VRUs, but I digress.)

2) (optional in most countries) damage to the insured vehicle can be insured, even in the case of 'at fault' accidents (for example, you hit a tree with no other road users involved). This is not a liability insurance but an object insurance (i.e. you insure the value of a certain object, like home insurance).

When talking about accidents involving autonomous vehicles, we are talking about part 1: the liability insurance.

Whoever has to cover this liability (Tesla / Tesla Insurance / other insurance), this is just a liability insurance contract like any other.

Tesla legal will not mix this with purchase price. Let's assume Tesla solves FSD and starts the Tesla Network. You can now send your car out in the fleet and earn fares. You bought your Tesla vehicle for $50k and you bought FSD software for $50k (or a subscription, doesn't matter).

The purchase price of the car and the software is to be viewed separately from the insurance premiums.

It is my belief that at first other insurance companies will not be a good choice to insure Tesla vehicles with autonomous capability:
- either the contract will state that liability is not covered when no driver is behind the wheel, or you are always viewed "at fault" with regard to your insurer, meaning your premiums will be increased;
- or the insurance premiums would be insanely high since these companies won't "trust" Tesla FSD and they'll want to overcharge to minimize risk.

Cue Tesla Insurance: not only will they allow you to use your Tesla in autonomous mode (without a driver), they'll also insure liability of the car in any driving mode. They can offer the most competitive premiums since they have the accident data (fleet-wise AND individually). I can imagine Tesla Insurance contracts could encourage the use of FSD, since Tesla knows the risks of accident are lower. For example, they could stipulate that you will never be regarded as "at fault" (in relation to Tesla Insurance) if FSD was enabled and not interfered with. In other words: if you use FSD, you would not risk an increase in insurance premiums. Only if you drive yourself (manually) and get in an "at fault" accident do your insurance rates go up.

That's why I'm pretty sure Tesla will at first limit the vehicles in the Tesla Network to Tesla vehicles insured by Tesla Insurance. This would save Tesla a lot of legal hassle during the growing pains of the Tesla Network.

But to be clear, the autonomous Tesla vehicles would have their liability insured by an insurance company (Tesla Insurance), not a manufacturer. Tesla Inc. might put both of these under the same umbrella, but they are internally (financially) separated.

Side note: that's when in my mind Tesla cars will be fully autonomous, surpassing human level driving: the moment Tesla insurance covers you even without supervising the software/without anyone in the driver seat.
 
But to be clear, the autonomous Tesla vehicles would have their liability insured by an insurance company (Tesla Insurance), not a manufacturer. Tesla Inc. might put both of these under the same umbrella, but they are internally (financially) separated.
But Tesla the manufacturer would still be liable for defects in the system (i.e. any at fault collision). There is no way for Tesla to limit their liability unless there are laws passed to cap damages involving autonomous vehicles.
 
But Tesla the manufacturer would still be liable for defects in the system (i.e. any at fault collision). There is no way for Tesla to limit their liability unless there are laws passed to cap damages involving autonomous vehicles.

This is NOT how the law is currently crafted, not nationwide and not state by state. The liability still lies with the driver, 100%, and Tesla has VERY carefully crafted their EULA to make sure that users are the responsible party, not Tesla. Anyone using Autopilot or FSDb is literally saying: "In the event of an accident, even if there might be concern that the system had a degree of fault, I accept FULL responsibility by using this system."

MSM may make a lot of hubbub about "autopilot crashes," but you will not see a single ruling against Tesla where they have had to pay out $$$ in this regard.
 
This is NOT how the law is currently crafted, not nationwide and not state by state. The liability still lies with the driver, 100%, and Tesla has VERY carefully crafted their EULA to make sure that users are the responsible party, not Tesla. Anyone using Autopilot or FSDb is literally saying: "In the event of an accident, even if there might be concern that the system had a degree of fault, I accept FULL responsibility by using this system."

MSM may make a lot of hubbub about "autopilot crashes," but you will not see a single ruling against Tesla where they have had to pay out $$$ in this regard.
I'm talking about systems where there is no driver such as FSD when it's out of beta.
 
I'm talking about systems where there is no driver such as FSD when it's out of beta.

FSD, even out of beta, will still be an L2/L3 system, by design and legally. Tesla is not going to take on the legal weight of responsibility, not until we see things like steering wheels being removed from cars.

A close friend of mine was Tesla's former head attorney; he spent a VERY large portion of his time on the legal ramifications of FSD. In the end, don't expect Tesla to take on any legal responsibility until FSD's accident rate is statistically 100-1000x lower than a human's.
 
FSD, even out of beta, will still be an L2/L3 system, by design and legally. Tesla is not going to take on the legal weight of responsibility, not until we see things like steering wheels being removed from cars.

A close friend of mine was Tesla's former head attorney; he spent a VERY large portion of his time on the legal ramifications of FSD. In the end, don't expect Tesla to take on any legal responsibility until FSD's accident rate is statistically 100-1000x lower than a human's.
When an L3 system is enabled the manufacturer is liable.
The plan is to make FSD unsupervised (I admit he might be talking about a future product also called FSD).
 
But Tesla the manufacturer would still be liable for defects in the system (i.e. any at fault collision). There is no way for Tesla to limit their liability unless there are laws passed to cap damages involving autonomous vehicles.
Disagree because this all depends on the contracts involved.

What I was trying to explain with my earlier post is that society nowadays sees the use of motor vehicles as a risk that has to be insured against.

The fact that the vehicle is driven by a human or by software does not change the above. The main question is: what insurance company will be brave/dumb enough to insure a vehicle driven by software? Tesla Insurance, that's who (at first). And they'll only start doing this if they know damn well that the accident rate (at fault) of FSD software is many times lower than the accident rate of humans.

When an autonomous vehicle has an "at fault" crash (according to traffic laws), this does not automatically mean the autonomous software stinks. In some instances both parties are doing something risky and one is deemed at fault, but that is the risk of entering traffic.

(Example: I drive behind a vehicle on a highway. The lead vehicle swerves for a piece of debris. Due to sunlight glare I don't notice the debris in time, so I hit it and puncture a tire, spinning me out of control and causing a crash with others involved. I'll be labeled "at fault" according to traffic laws (I couldn't avoid road debris that another driver did avoid), but many humans and quite possibly FSD would've made the same mistake. If FSD failed in this instance like a human sometimes could/would, it doesn't mean the FSD software is broken/defective.)

Taking part in traffic is a risk. Always. And that risk comes from propelling heavy blocks of steel (cars) at high speeds, a risk that is forced upon other drivers and VRUs. That's why countries made laws to make liability insurance for motor vehicles obligatory. (Not for VRUs; they are generally not seen as the contributor to/cause of high-damage accidents.)

To your point: if FSD is "faulty" (in other words the software is buggy) then I just don't think we're at a stage where autonomous driving without supervision will be insured. So it's kind of a moot point. The defects in the software will either be gone before autonomy is here, or they'll be so rare that the insurance companies just cover these edge cases.

(When I say "autonomy is here" I mean that you are legally ALLOWED to drive without supervision.)

In the case of Mercedes claiming they're responsible if their L3 software fails, this is a voluntary commercial claim they make in order to attract customers. Legally, those Mercedes vehicles will have their liability insured by a third party (an insurance company), and that insurance company can then TRY to recoup its payments by suing Mercedes if the accident happened due to the L3 software. But this will be a very steep uphill legal battle, since Mercedes has thrown in enough caveats (no sun glare, must have a lead vehicle within x meters, no faster than a set speed, etc.).

I don't think a trend will develop in which manufacturers will contractually guarantee you won't have "at fault" crashes whilst using their software. That is an impossible thing to promise. OK, they can cover any damages, but that won't happen with full autonomy. In the case of Mercedes it's easy to make a claim like that since the traveling speeds are so slow it'll always be small claims. If you were using Mercedes L3 and another car hits you (he's at fault), then Mercedes does not cover jack sugar. Only when their autonomous software can't stop for something in very specific conditions. That's quite a hollow guarantee. Tesla could do the same but wisely chooses not to.
 