Welcome to Tesla Motors Club

FSD - Interesting Wording!!

Full Self-Driving Capability
All you will need to do is get in and tell your car where to go and your Tesla will figure out the optimal route.
Please note that Self-Driving functionality is dependent upon extensive software validation and regulatory
approval which may vary widely by jurisdiction.

"...your Tesla will figure out the optimal route" = Doesn't exactly say that it will drive itself!

Even if Tesla wants to release it, imagine how slowly Congress works, not to mention individual state legislatures! Whether they would even allow this technology on public roads for everyone is unknown, but expect some states not to be too keen, because this will eventually cost lots of jobs. The politics of this might move so slowly that, for those who bought the car now, it might just be too old by the time (IF ever) FSD becomes legal.
 
In reality, many places in the US already allow self-driving cars: states like GA, TN, and others. In fact, even CA allows it, just with some limitations about reporting, I think?

No, CA approved Waymo's 34 or so cars to test driverless operation in a limited number of suburbs in NorCal. CA may not have issues passing legislation, because after all it's the Silicon Valley hub!
 
Tesla will figure out the optimal route, and then YOU fully self-drive the car to the destination. Full Self Drive! But seriously, the only thing it promises is that it will figure out the route, which any GPS could do. I think it used to say you wouldn't have to provide any input or hold the steering wheel during the drive. Now it doesn't say that.
 
..."your Tesla will figure out the optimal route" = Doesn't exactly say that it will drive itself!...

Saw that as well. Basically a description of navigation software.
 
You have to read the whole thing. Here is what Tesla has on its website:

"Full Self-Driving Capability
All new Tesla cars have the hardware needed in the future for full self-driving in almost all circumstances. The system is designed to be able to conduct short and long distance trips with no action required by the person in the driver’s seat.

All you will need to do is get in and tell your car where to go. If you don’t say anything, the car will look at your calendar and take you there as the assumed destination or just home if nothing is on the calendar. Your Tesla will figure out the optimal route, navigate urban streets (even without lane markings), manage complex intersections with traffic lights, stop signs and roundabouts, and handle densely packed freeways with cars moving at high speed. When you arrive at your destination, simply step out at the entrance and your car will enter park seek mode, automatically search for a spot and park itself. A tap on your phone summons it back to you.

The future use of these features without supervision is dependent on achieving reliability far in excess of human drivers as demonstrated by billions of miles of experience, as well as regulatory approval, which may take longer in some jurisdictions. As these self-driving capabilities are introduced, your car will be continuously upgraded through over-the-air software updates."


The wording is fine, but who is going to take responsibility when FSD is involved in an accident?
 
Actual FSD operation requires a lot more than maintaining speed, lane keeping, lane changing, entering exit/entry ramps, ...

What about hearing emergency vehicles or trains that you can't see until too late to adjust?

What about obeying hand signals from police or emergency responders?

What about insurance liability? Is the vehicle owner responsible if there is a major accident or fatality? Or would it be the manufacturer who has essentially provided the driver?

While Musk keeps saying the FSD software should be "feature complete" by year-end, does that mean FSD will be ready to safely use in most driving situations (even in driver assist mode) or that it's close to receiving regulatory approval - probably not...

But that's OK. NOAP has flaws, but it is a considerable improvement over the AP software from a year ago, and we expect that a year from now NOAP on limited-access highways will be working much better, and we'll find it useful to help with driving on other roads.
 
...What about insurance liability? Is the vehicle owner responsible if there is a major accident or fatality? Or would it be the manufacturer who has essentially provided the driver?...

Elon Musk said it is like owning an elevator. When something's wrong, the owner has insurance to cover it.

He said, if there's an FSD mishap, your insurance would cover it first and if it's Tesla's system fault, Tesla would pick up the bill.

So he did cover the financial side, with Tesla paying if necessary, but he did not cover who would go to jail.

I think when something goes wrong, the owner of a property is liable whether or not the owner was present at the property.

The money part is easy to cover, but if someone needs to be held accountable in a criminal case, it's usually the owner of the property first, and then it's up to the owner to pass the buck to the manufacturer.

But manufacturers seldom go to jail (for example, the current passenger deaths from the Boeing 737 Max, which Boeing and the FAA kept declaring safe for days after the rest of the world grounded them).

...While Musk keeps saying the FSD software should be "feature complete" by year-end...


When Kirsten Korosec from TechCrunch asked about the problem of calling it "full self-driving" capability, Elon Musk said:


"I think where we're very clear is, you know, when you buy the car, what's meant by full self-driving is that it's feature complete. But feature complete requiring supervision. And then as we get more (we really need billions of miles, if not maybe 10 billion miles or kilometers, on that order, collectively from the fleet), then in our opinion probably at that point supervision is not required, but that will still be up to regulators to agree. So, very quickly, there's really three steps. This being:

1) feature complete self-driving but requiring supervision,

2) feature complete but not requiring supervision, and

3) feature complete, not requiring supervision, and regulators agree."
 
...if there's an FSD mishap, your insurance would cover it first and if it's Tesla's system fault, Tesla would pick up the bill...

The display screen could say in bold letters: FSD AT FAULT and Tesla will say: Not our fault! :rolleyes:
 
...He said, if there's an FSD mishap, your insurance would cover it first and if it's Tesla's system fault, Tesla would pick up the bill...

Good post! Do you have a source for Elon saying that whatever your insurance didn't cover, Tesla would pick up? I believe you, but hadn't heard that.

Also, staying with the financial side of things for a minute: say your (full) self-driving car is at fault for an accident while under FSD. Your insurance covers it. OK...now your insurance wants to raise your rates going forward. So that's an ongoing cost that doesn't concern Tesla but will concern owners.

The whole liability thing is going to get tricky soon...well before FSD. As soon as "confirmation-less" / no-stalk NoA turns on, it will get muddy. It's still considered driver assist, and I'd imagine there will be an (additional) legal disclaimer saying that if you turn this option on, you take the risk, know it is in beta, have to be responsible at all times, etc. But it will get muddy because you can be holding the wheel and looking ahead in traffic like a good driver should, and all of a sudden the car could make an AP-initiated lane change. If that is a bad lane change and you sideswipe another car, Tesla could say that you have all the responsibility because you agreed to the legal disclaimer. But it seems a bit muddy to me, because while active cruise control has existed for quite some time (even if it is usually less advanced than Tesla's), and even lane assist is becoming more common (and keeping hands on the wheel is somewhat intuitive, allowing easy take-over if something goes wrong), auto-changing lanes without confirmation is a brand new beast!

The car decides that it needs to change lanes. You are driving along holding the wheel and looking out at traffic - not inside the car at the IC. The car moves left (or right) to change lanes. You are momentarily surprised, and immediately check your rearview mirror, trying to stay engaged, but in the second you need to react and check the mirror - too late - boom, you kiss the car that AP didn't "see" and you have an accident. Is it really your fault? What could you have done in the scenario above to avoid this? I mean, other than turning NoA and confirmation-less lane changes off. Which you totally could. But then Tesla doesn't get the additional billions of miles of data they need to progress the technology further. So it gets muddy in my mind. And of course FSD takes this to another level.

If it was a situation where someone has to go to jail - and of course we all hope we are never involved in that kind of accident, AP, FSD or just ourselves driving manually - yeah, that is an interesting point. I assume it would be the driver? But even that brings up issues: the owner is the one who accepted the AP legal disclaimers. Let's say I am the owner. I accept the AP disclaimer and take full responsibility for NoA auto-lane change. I loan the car to a friend who is familiar with Tesla and Autopilot. S/he clicks Nav on AP and uses it. It causes an accident during a lane change. Worst case, it is a major accident and someone needs to go to jail. Does my friend go to jail? After all, s/he was driving. Or do I go to jail, even though I was sitting at home? Probably the legalese says I am responsible for telling any other drivers of the limitations of Nav on AP. But can my legal acceptance, even if I did inform them of all the same information, make them legally responsible, too? Hmmm.



..."So, very quickly, there's really three steps:

1) feature complete self-driving but requiring supervision,

2) feature complete but not requiring supervision, and

3) feature complete, not requiring supervision, and regulators agree."...

Yes, the above three phases are exactly correct. But in general, Tesla and Elon haven't communicated these steps clearly/often enough.

But as I stated in my long example above, the "requiring supervision" part can get muddy as there will be times when the FSD system will do something fast enough (sometimes necessarily so) that no amount of "supervision" might allow the human to realistically intervene.
 
...source for Elon saying that whatever your insurance didn't pick cover, Tesla would pick up? I believe you, but hadn't heard that...

I only remember hearing it, not in print but from an audio source quite some time ago, and I remembered it because he gave the example of it being like owning an "elevator." It could have been in one of those conference calls.

...jail...

Most car accidents would be covered by money and no one would go to jail as a criminal case.

However, that's not unheard of. If the victim is a VIP, for example, money won't be enough; the driver must pay in blood!

Or if the accident is intentional, such as a driver willingly getting drunk and then driving.

And in cases where I don't drive but willingly let a drunk driver borrow my car, I could go to jail too!

With FSD, if I willingly let a Timothy James McVeigh borrow my car, and instead of driving it he let the car drive on its own to the Oklahoma City Federal Building and blow it up, I could go to jail too!

..."requiring supervision" part can get muddy...

Currently, I see the driver as responsible for the foreseeable future, even with Tesla's FSD, because it won't be out of the beta program any time soon.

But once Tesla announces that FSD is no longer in the beta phase but of final production quality, I expect Tesla will take responsibility for the system's failures. I still don't see how we would stay out of jail if there's a criminal case when something goes wrong (such as when a car-accident victim is a VIP and some heads must roll as revenge).
 
Updated from Tesla Autonomy Day:

When asked who would cover damages, the owner or Tesla, if there's an accident while an owner's car is enrolled in the Tesla RoboTaxi network, Elon Musk replied in short, repeated phrases: "Tesla. Probably. Probably, Tesla."

Yeah, I heard that yesterday. Though "probably" (and the way he said it) doesn't exactly convey to me that they have thought long and hard about it and are brimming with confidence. But still, even though I am somewhat centrist in my views on regulation, I think the (insert your country here) government might require companies that claim unattended full self-driving to insure their cars for accidents caused while in FSD/level 4/whatever-we-call-it mode. Sort of a put your money (literally) where your mouth (advertising) is requirement. Early days for all of this. We'll see what happens.
 
Musk's comment on liability may not end up being Tesla's official position when FSD is activated, because it would expose Tesla to a lot of damages when accidents occur (being "safer than a human driver" doesn't eliminate accidents, it only reduces them). Would Tesla accept that liability on their own, or will they wait for the insurance companies or governments to require them to?

Since there will be other self-driving vehicles on the road (in limited conditions) before Tesla tries to get regulatory approval, the liability issues could become much clearer well before Tesla has FSD ready to roll - which could mean Tesla will have to accept responsibility while FSD is engaged, or that they'll only accept liability if FSD is responsible.