Welcome to Tesla Motors Club

Growing FSD liability could be massive.



“For those that have followed the Tesla story, you know that CEO Elon Musk has basically talked about full self-driving for almost a decade now. The company still has not shown off the U.S. coast-to-coast autonomous drive that it said would come by the end of 2017, for instance.”

“The potential liability question comes into play here if you think Tesla won't be able to truly solve FSD with the current hardware. One might think that a potential lawsuit would just want to refund the FSD price, and those arguing here might cite the recall number of vehicles to get a total liability. At an average cost of $10,000 for FSD, that gets you to around $4 billion when you consider how many customers have reportedly paid for the package so far.”

[Image: Full Self-Driving Computer Installations | Tesla Support, Other Europe]
 
Tesla will craft the refund process, if and when they get forced into it, to minimize the refunds.

They will probably toss more features into EAP and raise its price, so if you want a refund, you get a smaller refund, or you have to give up not just Autopark, auto lane change, and Navigate on Autopilot, but a few other things as well. People who bought FSD are the sort of people who like a car "fully loaded," so many will not want to unload their car just to get back $5K. (Many paid $10K to $12K for FSD, and a downgrade to EAP would only be a $4K to $6K refund plus interest.) But Tesla might shrink the difference further by tossing in more features.

I don't know exactly what they will do but I know they will hunt for things, and probably will find them.

I paid $2K for FSD during the tiny window that it was that price, so my refund is not that likely.
There once was a man who had a sick cow and not enough money to buy a new one. So he set up a lottery: buy a $1 ticket for a chance to win a cow. The whole village and the surrounding villages were excited, and everyone bought a $1 ticket.

On the day of the drawing, the man picked the winning number and called the winner to come collect the cow. The winner came, looked at the sick cow, and refused to take it. He was mad and demanded a refund. The man gave him his $1 back. Everyone was happy.
 
Tesla will craft the refund process, if and when they get forced into it, to minimize the refunds. […]

Any "refund" has to be accepted by the people getting the refund. Essentially the "refund" has to be worth more than the customer's time + cost to pursue other options. If the cost is to go through arbitration (even with a well known script and help filling it) then alot of people will take the easy way (whatever Tesla offers to relieve their liability).

I'd note that changing the feature definitions at this point won't help Tesla on existing purchases.
 
To me, the largest risk for Tesla would be a collision while using HW3 that HW4 would have avoided. If a higher-resolution front camera or the new radar would have prevented a crash with another car or a pedestrian, fault becomes more interesting: Tesla had the systems to prevent the crash, but wasn't using them.
 
I sometimes wonder if the question of ‘who’s at fault/liable’ when an FSD vehicle is involved in an accident might be holding all the FSD efforts back. As far as I know that question has not been resolved, and I assume no carmaker would willingly accept…
 
Before the Elon excuse makers arrive with the usual “bUt mErCeDeS iS LiMiTED to 37 mPH dude”

Yes, because that is how you honestly and responsibly progress with your technology. You start off slow, then expand when it's safe to do so. That way your tech doesn't end up slamming into 14+ stationary emergency vehicles with their lights flashing.
 
INCORRECT.
“"You can look away, watch a movie, or zone out.” That is not what Mercedes expects. They expect you to take control in 10 seconds. Impossible if you are zoned out.
Irrelevant to the comment in the post I answered.
MB FULLY accepts liability when their autonomous driving system is used under the conditions that they specify.

Tesla accepts ZERO responsibility for its “FULL full self driving” feature
 
I think Tesla could immediately assuage a great deal of this if they just offered a TRANSFER of FSD to a new vehicle, or say a 66% credit towards a new FSD purchase with a new vehicle.

Most buyers, past and present, would like to think that at SOME point in the future FSD will have SOME utility beyond what it has today, either for personal or commercial use. They bought an OPTION on their current purchase, which may or may NOT be capable of whatever the future actual vision is (no pun intended).

Keep the loyalists and minimize any "refund" complication. Although in the grand scheme of things, at THIS point a $2B hit to earnings in a year would be far less relevant than in any prior year, and knocking it down 90% would be enough to make it a rounding error.
 
HW3 no longer includes radar (or ultrasonics for that matter, but those are only relevant to EAP features). Short of dramatically more advanced (and consequently much more expensive) cameras, you won't be able to match human perception in some lighting edge cases (high dynamic range, and low-light vision without too much blur from longer accumulation times). HD radar is a great way to fill this gap with a relatively cheap solution that degrades under a different set of conditions (i.e., the set of times when both radar and cameras would be effectively useless is much smaller).
Do you have any evidence to support these assertions?
 
The idea of Tesla being liable for something on HW3 that wouldn't have happened with HW4 is an interesting question. Obviously, you (or any entity) don't have to accept liability to be liable; liability is based on the action, or lack of action, that resulted in harm of some sort to another party. I doubt you could build a strong enough case for HW3 vs. HW4 unless you were to argue that HW3 shouldn't even have the feature set it does because it is unsafe (and that those features were directly responsible for the crash). This is probably also untested territory regarding OTA updates enabling new features on hardware that may not be fully up to the task at hand.

Now, where I think there would be real liability risk is in the removal of radar. If someone were to show that radar would have prevented a crash that occurred, you might be getting into the territory of Ford's fiasco with Super Duty trucks and thinner, lower-grade roof steel over time: essentially blatant cost cutting that invalidated safety ratings without the necessary adjustments. It should be noted that juries generally look harshly on cost cutting that results in injuries and deaths (which also gets into treble damages, etc.).
 
Do you have any evidence to support these assertions?

I think I made multiple assertions:

Regarding radar having a different set of environments where its data is invalidated: this is generally about the spectrum used; you are unlikely to encounter something that blinds both visual and microwave frequencies. The sun blinding your cameras won't blind your radar, and someone's old leaky microwave oven won't blind your cameras (and the cases where both are blinded would generally stop a human driver too). HD radar is all the better, since it is much more able to distinguish things you are going to run into from big objects outside your path of travel. As I understand it, conventional radar throws out a signal and then measures the return over some amount of time, essentially giving a 2D view (time vs. intensity) for each frame. HD radar is more akin to a radar camera, where each pixel records a more focused cone of time vs. intensity, so you end up with a 4D view (time vs. intensity vs. height vs. width) within the general cone of focus. The result is that you can now see that something bouncing back a signal is above the car, instead of not knowing whether an upcoming return is a bridge or a pedestrian on the road (since things in the center of the cone have a stronger return than things on the edge).
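A toy numeric sketch of that distinction (all numbers, array shapes, and names here are made up for illustration; this is not any real radar interface): a conventional frame collapses every elevation into one range trace, while a per-beam frame lets you separate an overhead return from an in-path one.

```python
import numpy as np

# Toy scene: two reflectors at the same range bin (30) --
# a bridge overhead and a pedestrian at road level.
N_RANGE, N_EL = 64, 5            # range bins, elevation beams

# Conventional radar: one trace per frame, all elevations summed together,
# so the bridge and the pedestrian are indistinguishable.
conventional = np.zeros(N_RANGE)
conventional[30] += 1.0          # bridge return
conventional[30] += 0.6          # pedestrian return

# "HD" radar: a trace per elevation beam (add azimuth for the full 4D view).
hd = np.zeros((N_EL, N_RANGE))
hd[4, 30] = 1.0                  # strong return well above the road
hd[0, 30] = 0.6                  # return at road level -> potential obstacle

bridge_only = np.zeros((N_EL, N_RANGE))
bridge_only[4, 30] = 1.0         # same scene without the pedestrian

def in_path_obstacle(frame, max_el=1, thresh=0.5):
    """Flag returns only at elevations low enough to be in the car's path."""
    return bool((frame[:max_el + 1] > thresh).any())

print(in_path_obstacle(hd))           # True: the pedestrian is in the path
print(in_path_obstacle(bridge_only))  # False: the bridge is overhead
```

The conventional trace sees the same blob at range bin 30 in both scenes; only the per-beam frame can tell them apart.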

Regarding better cameras avoiding blinding scenarios: this is more anecdotal, from my use of a variety of cameras for different things. Basically, I've observed that small-sensor cameras with cheap optics are much easier to blind than larger-sensor cameras with more precise optics. For clarity, with enough on-package processing, a lot can be done to deal with bright situations by reducing the exposure and then integrating a subsequent image at a higher exposure. In low light, on the other hand, you hit limits on both the exposure time and the base noise level, which is basically the floor for the integration time (and longer exposures also blur moving objects). With good enough cameras and enough separation (so not all cameras are blinded in the same spot), you could probably eliminate the vast majority of blinding scenarios, but from what I've seen of video from my Tesla and others, the current camera setup is pretty far from that.
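The bright-scene trick described above (expose short for the highlights, expose long for the shadows, then merge) can be sketched as a toy exposure-bracketing model. Everything here is illustrative: the radiance values, the 8-bit quantization, and the saturation level are made-up parameters, not any real sensor pipeline.

```python
import numpy as np

# Toy HDR merge: combine a short and a long exposure of a scene whose
# radiance range exceeds what one quantized exposure can hold.
scene = np.array([0.02, 0.1, 1.0, 8.0, 40.0])   # arbitrary radiance units
FULL_WELL = 10.0                                # sensor saturation level

def expose(radiance, exposure_time):
    """Simulate a sensor: scale by exposure time, clip, quantize to 8 bits."""
    signal = np.clip(radiance * exposure_time, 0.0, FULL_WELL)
    return np.round(signal / FULL_WELL * 255) / 255 * FULL_WELL

short = expose(scene, 0.25)   # highlights kept, shadows crushed toward 0
long_ = expose(scene, 4.0)    # shadows resolved, highlights clipped

# Merge: trust the long exposure except where it saturated.
saturated = long_ >= FULL_WELL
merged = np.where(saturated, short / 0.25, long_ / 4.0)
print(np.allclose(merged, scene, rtol=0.05))    # True: radiance recovered
```

The catch the post points out: this works for bright scenes, but in low light both exposures sit near the noise floor, and stretching the long exposure further blurs anything that moves.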
 
As I drive my car every day (20 miles or less), I see something different on the roadway, or different driving circumstances, and I say to myself: FSD may NEVER be able to drive grandma to the doctor with no intervention; it just doesn't handle this situation. So, with all the infinite possibilities that can occur, is there a set of code that can do what I might have to do? People talk about edge cases. Well, when grandma dies in a car accident on the way to the doctor, "it was an edge case" won't fly. Tesla should really work on making a car that assists the human driver in almost every way possible.
 
Any "refund" has to be accepted by the people getting the refund. Essentially the "refund" has to be worth more than the customer's time + cost to pursue other options. If the cost is to go through arbitration (even with a well known script and help filling it) then alot of people will take the easy way (whatever Tesla offers to relieve their liability).

I'd note that changing the feature definitions at this point won't help Tesla on existing purchases.
No, we are speaking of a refund ruled by a court, which has to be accepted by the court (and the person agreeing to the refund, which will indeed be optional.)

The court is likely to accept that you get a full refund of the FSD price plus interest. Some will argue for more, that they deserve compensation because they never would have bought a Tesla if they didn't think that FSD would eventually show up. We will see how they do, but I think the former terms are likely to pass muster with the court -- your money back, with interest. And of course full removal of FSD.

However, Tesla might convince the court to give you less. They might expect you to pay something for the years you had FSD, because you got some features like EAP and a bit more. So if the car's life is 20 years and you had FSD for 5 years, they might assign a $6K value to EAP and a $2K value to the extra features, for $8K total; prorated over a quarter of the car's life, that is $2K, so you would get your FSD price minus $2K.
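Spelled out, the proration in that scenario looks like this (every figure is hypothetical, taken straight from the example above):

```python
# Hypothetical proration; all numbers are illustrative, not actual terms.
fsd_price   = 10_000   # what the owner paid for FSD
eap_value   = 6_000    # value a court might assign to the EAP features
extra_value = 2_000    # value assigned to the other delivered features
car_life    = 20       # assumed vehicle life in years
years_used  = 5        # years the owner actually had the features

# $8K of delivered value, prorated over a quarter of the car's life = $2K.
delivered = (eap_value + extra_value) * years_used / car_life
refund = fsd_price - delivered
print(refund)   # 8000.0 -> the FSD price minus $2K
```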

But Tesla will try to make you not take this refund. They might say that "If you don't insist on the refund, you will get things" like:
  • Transfer of FSD to a future vehicle
  • Keep EAP plus a bunch of other special features created to entice you
  • "No, really, just don't do the refund and Elon is absolutely certain FSD will work next year, but you do have to sign this waiver"
  • "Autosteer on city streets" (ie. ADAS not self-drive) as a released product.
 
As I drive my car every day (20 miles or less) I see something different on the roadway or see different driving circumstances and I say to myself: FSD may NEVER be able to drive grandma to the doctor with no intervention. […]
Are you implying that you know more about what FSD can and will do than Elon knows?
 
Are you implying that you know more about what FSD can and will do than Elon knows?
I don't know about him, but yes, a lot of people have come to the conclusion that Elon has blinded himself to the truth about what FSD can and will do, and thus he "knows" less than others. Not, of course, due to lack of access to the information, but due to his own confirmation biases leading him astray. Normally, I would never want to say that about somebody, but if you make predictions about a product for nearly a decade, and claim you are "certain" and are consistently wrong, it is the likely conclusion.
 
As I drive my car every day (20 miles or less) I see something different on the roadway or see different driving circumstances and I say to myself: FSD may NEVER be able to drive grandma to the doctor with no intervention. […]
Between endless edge cases (conditions and other humans), weather, and liability, I wouldn't discount the possibility that L3/L4 is all we see in our lifetimes, with drivers, not the manufacturer, still liable. I once thought L5 was just a matter of time; I'm not so sure any more.
 