
Growing FSD liability could be massive.



“For those that have followed the Tesla story, you know that CEO Elon Musk has basically talked about full self-driving for almost a decade now. The company still has not shown off the U.S. coast-to-coast autonomous drive that it said would come by the end of 2017, for instance.”

“The potential liability question comes into play here if you think Tesla won't be able to truly solve FSD with the current hardware. One might think that a potential lawsuit would just want to refund the FSD price, and those arguing here might cite the recall number of vehicles to get a total liability. At an average cost of $10,000 for FSD, that gets you to around $4 billion when you consider how many customers have reportedly paid for the package so far.”
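A quick sanity check of that arithmetic (a back-of-the-envelope sketch; the ~400,000 purchase count is an assumption implied by the article's own figures, not a published number):

```python
# Back-of-the-envelope check of the refund-liability figure quoted above.
# The purchase count is an assumption implied by the article's own numbers;
# Tesla has not published an exact figure.
fsd_purchases = 400_000   # assumed number of FSD packages sold to date
avg_price_usd = 10_000    # average FSD price cited in the article

refund_liability_usd = fsd_purchases * avg_price_usd
print(f"Estimated refund exposure: ${refund_liability_usd / 1e9:.1f} billion")
# -> Estimated refund exposure: $4.0 billion
```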

Image source: Full Self-Driving Computer Installations | Tesla Support Other Europe
 
We are talking about FSD purchasers, not your grandma buying a Tesla.

I highly doubt ANYONE who dropped $7-15k on an FSD purchase had not heard of or read Musk's comments about it.

The only reason I even paid for it was because of his comments on AI Day, and that robotaxis would be here by the end of 2020 and all current FSD owners would be able to opt in (and generate revenue).

The fact that it is not even going to happen by the end of 2023, and that all these purchasers don't even have the option to transfer FSD to a new car so they can someday use what was promised, is how every FSD owner feels, no matter how much you want to deny it from your fan boi view.
Ah, there it is. I'll leave you to your viewpoints. Unless I can offer you a hot beverage.
 
Also, the concept that buyers of FSD "only" got their info about Tesla from Elon's Twitter is a fundamentally flawed argument.

One random example (of possibly hundreds) of why people may have purchased FSD back then... it even says the car will charge by itself without human input! (Where is that these days, by the way?)

Note: this was when the emperor was still wearing all of his clothes, leading up to Time Magazine Person of the Year. All of that. In other words, before today, when lawsuits about FSD are happening weekly.
 
Elon is timing this. People clamoring, building up huge lawsuits, and on the court judgement date, voila! He will release FSD.
 
It is highly unlikely that there is any Illuminati-level thinking here; it's just a race against time on a very expensive bet. If you look at the history of Musk's enterprises, he has time and again bet the farm on challenging goals, and so far he has done remarkably well. The big question is which will come first, the court judgments or the autonomy.
 
If you look closely, there were other challenges that had him distracted, such as Twitter, Starlink, Starship, etc.
 
you take control within 10 seconds.

This is a major factor to consider.

Within the next 10 seconds your car will no longer be able to take responsibility for its safe progress along the highway.

Tesla's version is 'your car lost the plot a moment ago, we never claimed it could fully do anything, and you (the driver) were always responsible.'

That sounds like the driver doing the self-driving, not the car. Based on that, Tesla hasn't delivered self-driving by the car, or self-driving capability. If it had, it could tell you even as little as one second in advance that the car would not be able to continue self-driving. But it can't.
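To make the distinction concrete, here is a minimal sketch of the difference between an advance-warning handover and an immediate disengagement. This is purely illustrative: the class names, states, and the 10-second window (the figure quoted above) are assumptions, not anyone's actual implementation.

```python
# Illustrative sketch only -- not Tesla's or Mercedes-Benz's actual code.
# It contrasts an advance-warning handover (a countdown before the system
# gives up responsibility) with an immediate "take over right now" drop-out.
from enum import Enum, auto

class Mode(Enum):
    ENGAGED = auto()               # system is driving and responsible
    TAKEOVER_REQUESTED = auto()    # driver has been warned, countdown running
    DRIVER_IN_CONTROL = auto()     # driver has taken over
    MINIMAL_RISK_MANEUVER = auto() # controlled stop with hazard lights on

class HandoverController:
    def __init__(self, warning_window_s: float = 10.0):
        self.mode = Mode.ENGAGED
        self.countdown_s = warning_window_s  # assumed 10 s advance notice

    def predicted_capability_loss(self) -> None:
        """The system foresees it cannot keep driving and warns the driver early."""
        if self.mode is Mode.ENGAGED:
            self.mode = Mode.TAKEOVER_REQUESTED

    def driver_took_over(self) -> None:
        self.mode = Mode.DRIVER_IN_CONTROL

    def tick(self, dt_s: float) -> None:
        """Called periodically; enforces the warning window."""
        if self.mode is Mode.TAKEOVER_REQUESTED:
            self.countdown_s -= dt_s
            if self.countdown_s <= 0:
                # Driver never responded: brake to a standstill, hazards on.
                self.mode = Mode.MINIMAL_RISK_MANEUVER
```

The whole argument above is about the TAKEOVER_REQUESTED state: a system that can only say 'take over right now' never enters it, which is what makes the responsibility question so murky.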
 
some of its present limitations are down to regulatory differences

As a gullible UK FSD owner, I have yet to see any evidence that Tesla has made any serious effort to address regulatory challenges. For 6 months of the year my car often has blinded or fogged up b-pillar cams, so at best it might deliver some kind of FSD half of the year.

It has yet to self park.

It often shows traffic lights with stop lines while driving on motorways (freeways / interstates).

The notion that FSD has delivered anything worthwhile beyond original EAP is a non-starter.
 
You are aware that the FSD preview you mention has almost nothing in common with the current (US/Canada only) FSD beta, right?
 
This is a major factor to consider.

Within the next 10 seconds your car will no longer be able to take responsibility for its safe progress along the highway.

Tesla's version is 'your car lost the plot a moment ago, we never claimed it could fully do anything, and you (the driver) were always responsible.'

That sounds like the driver doing the self-driving, not the car. Based on that, Tesla hasn't delivered self-driving by the car, or self-driving capability. If it had, it could tell you even as little as one second in advance that the car would not be able to continue self-driving. But it can't.
You do realize the panic it creates when someone who is “advised” that they can drive hands-free and do other things while the car is driving itself then gets called on to take over. In 10 seconds you have to stop whatever you are doing and get a handle on the situation.

That being said, I do appreciate the failover task in which “the system brakes the vehicle to a standstill in a controlled manner while engaging the hazard warning lights.”

The concern I have is with the time when the driver is taking over, in panic mode, and ends up crashing the car. Will Mercedes take liability? Of course not.
 
You are aware that the FSD preview you mention has almost nothing in common with the current (US/Canada only) FSD beta, right?

Of course. Hence the different opinions as to whether FSD has in any way been delivered outside of NA.

I hear rumours of testing in Europe but have seen no hard evidence of anything significant. If Tesla are ditching HW3 then I reckon FSD Beta or its equivalent will never be delivered to existing Euro / UK FSD owners. My car became a 'legacy model' a few months after purchase, and the only 'developments' on TACC / AP / FSD have been turning off the radar because they can't get it to integrate with the other sensors. Since it's super unlikely HW4 will be offered on HW3 cars, my low-res and b-pillar cams render my car FSD-incapable.

the FSD preview you mention

I don't recall mentioning 'FSD Preview'. I certainly didn't buy FSD Preview (!). I bought FSD capability and FSD software in beta. Or FSD Beta mk1, or FSD Beta / UK. Otherwise known and sold as FSD! The same capability as any other FSD-capable Tesla with HW3.

Thankfully, when the visualisation shows spurious traffic lights in the middle of a freeway, the car does not attempt to slow down. Same for the myriad claimed emergency braking events for my safety that don't affect the car's speed.

But traffic light control does attempt to work on regular city streets in the UK. Just loses the plot with multiple sets of lights close together.
 
You do realize the panic it creates when someone who is “advised” that they can drive hands-free and do other things while the car is driving itself then gets called on to take over. In 10 seconds you have to stop whatever you are doing and get a handle on the situation.

That being said, I do appreciate the failover task in which “the system brakes the vehicle to a standstill in a controlled manner while engaging the hazard warning lights.”

The concern I have is with the time when the driver is taking over, in panic mode, and ends up crashing the car. Will Mercedes take liability? Of course not.

I was focusing purely on the fact that one approach (MB) at least embraces the idea that the car needs to be aware, in advance, of a situation in which auto control can't be maintained.

Tesla's implementations seem fundamentally schizophrenic, lurching from 'total confidence' to 'total collapse' in many situations.

To drive safely, the car must have a 'window of confidence'. Whether FSD Beta has ever delivered this I can't say from first-hand experience, but I haven't seen evidence that it has. UK versions have regressed, IMO... e.g. no off-ramps.
 
The beta definitely shows that it is doing this kind of thing... in many cases the car slows when it observes situations where there is some ambiguity in other cars' behavior. The NNs rate confidence levels in the various trajectory predictions, and the car clearly takes these into account when assessing how cautiously to drive... in most cases the car is more cautious than a human (or, at least, THIS human).
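As a rough illustration of that idea (a sketch under assumed names and numbers, not Tesla's actual planner): if each predicted trajectory for a nearby road user carries a confidence score, the planner can scale its target speed down as the predictions get more ambiguous.

```python
# Illustrative sketch of confidence-scaled caution -- not Tesla's planner.
# Each predicted trajectory of a nearby road user carries a confidence in [0, 1];
# the lower the least-certain prediction, the more speed margin we keep.

def caution_scaled_speed(base_speed_mph: float,
                         trajectory_confidences: list[float],
                         max_slowdown: float = 0.4) -> float:
    """Reduce the target speed by up to max_slowdown when predictions are uncertain."""
    if not trajectory_confidences:
        return base_speed_mph
    worst = min(trajectory_confidences)       # least certain prediction
    slowdown = max_slowdown * (1.0 - worst)   # 0 when fully confident
    return base_speed_mph * (1.0 - slowdown)

# e.g. cruising at 40 mph past a car whose intent is unclear (confidence 0.3):
print(caution_scaled_speed(40.0, [0.9, 0.3]))  # -> 28.8
```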
 
at least, THIS human

Lol.

What is most frustrating in the UK is that Tesla have never delivered any of the benefits of the FSD city behaviour. Here the car does not intelligently change speed at all. Speed limit says 60 mph... drive at 60! Sharp bend ahead? If the speed limit says 50, it must be OK to go 50.

Has single stack actually happened in the US? I believe not. If that's the case, are US cars running the same or similar code on freeways as the only code we run over here?
 
I was focusing purely on the fact that one approach (MB) at least embraces the idea that the car needs to be aware, in advance, of a situation in which auto control can't be maintained.

Tesla's implementations seem fundamentally schizophrenic, lurching from 'total confidence' to 'total collapse' in many situations.

To drive safely, the car must have a 'window of confidence'. Whether FSD Beta has ever delivered this I can't say from first-hand experience, but I haven't seen evidence that it has. UK versions have regressed, IMO... e.g. no off-ramps.
I like your articulation but have to disagree on the “window of confidence” concept.

It is a binary choice of 0 or 1. Either you trust the driver or you don’t. That driver could be a well-trained chauffeur, an unknown Uber driver, or a software program. We don’t go about identifying windows of confidence when we hail a taxi or call an Uber. Do we?

At least Tesla is calling it out as “don’t trust it yet”.
 
The 'window of confidence' I was referring to would be evidenced by the fact that, driving along a street, you (a human) could probably blink your eyes shut for 1 second out of 3 in some situations without compromising safety, because you know what's possible given what you see at any given moment. That's your 'window of confidence'.

Now, driving around a busy city in rush hour, you would have too much going on to risk shutting your eyes at all. Maybe 0.5 seconds every 5 seconds?

MB would appear to believe that their system would know of an impending overload 10 seconds before it happened. (I too doubt they would take responsibility for those 10 seconds, though!) But at least they put a figure on it and acknowledge the need for a handover from 'self drive' to 'human drive'. Not just 'take over right now', which raises the question of whether the car was ever really in control.
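To put a number on the 'window of confidence' idea (a toy model, nothing any manufacturer has published): the fraction of time that could safely be spent 'eyes off' shrinks as the scene gets busier, interpolating between the two examples above.

```python
# Toy model of the "window of confidence" -- not from any manufacturer.
# Maps a scene-complexity score in [0, 1] to the fraction of time that could
# safely be spent "eyes off", interpolating between the two examples above.

def eyes_off_fraction(scene_complexity: float) -> float:
    quiet_street = 1.0 / 3.0   # ~1 s in every 3 s on a quiet street
    rush_hour = 0.5 / 5.0      # ~0.5 s in every 5 s in rush-hour traffic
    c = min(max(scene_complexity, 0.0), 1.0)
    return quiet_street + c * (rush_hour - quiet_street)

print(f"{eyes_off_fraction(0.0):.0%}")  # quiet street   -> 33%
print(f"{eyes_off_fraction(1.0):.0%}")  # rush-hour city -> 10%
```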
 
As a gullible UK FSD owner, I have yet to see any evidence that Tesla has made any serious effort to address regulatory challenges. For 6 months of the year my car often has blinded or fogged up b-pillar cams, so at best it might deliver some kind of FSD half of the year.

It has yet to self park.

It often shows traffic lights with stop lines while driving on motorways (freeways / interstates).

The notion that FSD has delivered anything worthwhile beyond original EAP is a non-starter.
Autonomous driving is still illegal in the UK. They are planning to set up the legal framework and certification program by 2025. I'm pretty sure that in its current state Tesla's FSD won't come anywhere close to passing it. I mean, the thing cannot even navigate stationary traffic in a parking lot.

Speaking of which, I got a message from Tesla's rep yesterday: "We are expecting the software update to reenable AutoPark soon. There is no definite date yet." Same thing they said two months ago.
 