Welcome to Tesla Motors Club

UK FSD Discussion

Not really... if the car is truly self-driving, the human 'driver' is just a passenger and not responsible for any accidents.

Therefore the responsibility falls on the manufacturer of the computer system that is driving. If the manufacturer isn't prepared to take on that responsibility, it shows they don't believe their system is safe enough for general use.
While an automated driving system might prevent 99% of accidents, it would give the manufacturer 100% of a liability it never had before.
 
The burden of manufacturer liability is dragging the whole project down.
If you break your neck in a car accident because the software got it wrong, are you going to just think "damn, but never mind; statistically, a couple of people somewhere haven't been in an accident and I've taken one for Team Tesla"?

The choice is simple: either Tesla pays, or your insurance will have to take on the extra obligation, and of course for them to do so they'll want to be satisfied about the quality and appropriate use.
 
Which raises the question: if FSD disengages one second before impact because it has run out of options, then technically you are the one 'in control' when the accident happens.
Where do we draw the line?
That's why even L3 requires a 7-10 second handover period during which the car remains responsible, something Tesla aren't even trying to implement, which is why they're stuck at L2.

The car needs to know it can cope 100% for long enough for the driver to become cognitively aware of what's going on and take over control. At the moment, Tesla just runs out of talent, and while in the US that doesn't happen as often as it does here, the system still fundamentally doesn't know what it can't do until it can't do it.
 
[FSD] ... and even happily power slide around a roundabout for fun

Yes please ... where do I pay? :)

either Tesla pays, or your insurance will have to take on the extra obligation, and of course for them to do so they'll want to be satisfied about the quality and appropriate use

If FSD is, on average, X times better than a human, the insurance company actuaries should be happy, shouldn't they?
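The actuarial point above can be sketched as a toy expected-cost calculation. All numbers and the `expected_annual_claim_cost` helper are hypothetical illustrations, not real insurance data:

```python
# Toy actuarial sketch (illustrative numbers only, not real claims data):
# if FSD is X times better than a human driver, the expected claim cost
# per policy scales down by roughly 1/X, all else being equal.

def expected_annual_claim_cost(base_accident_rate: float,
                               avg_claim_cost: float,
                               safety_multiplier: float) -> float:
    """Expected cost per policy per year, assuming accidents are
    safety_multiplier times less frequent than the human baseline."""
    return (base_accident_rate / safety_multiplier) * avg_claim_cost

# Hypothetical baseline: 5% chance of a claim per year, £8,000 average claim.
human = expected_annual_claim_cost(0.05, 8000, 1.0)   # 400.0
fsd_3x = expected_annual_claim_cost(0.05, 8000, 3.0)  # ~133.33
print(f"human baseline: £{human:.2f}, FSD at 3x safer: £{fsd_3x:.2f}")
```

Of course, the actuaries would also want to price in the tail risk of who is liable when the system fails, which is exactly the argument elsewhere in this thread.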
 
Which raises the question: if FSD disengages one second before impact because it has run out of options, then technically you are the one 'in control' when the accident happens.
Where do we draw the line?
Because it shouldn't, so the issue is with the manufacturer.

It's not like AP/FSD etc., where the car can give up with no notice. Level 3 should give a pre-planned takeover warning (typically quoted as around 10 seconds) when the car falls out of its ODD, e.g. a pre-planned exit from traffic-jam assist when traffic starts moving.

If the car cannot handle a situation gracefully and exits without warning, the fingers point to a failure of the system.

[edit] I should have read past the OP, as @GeorgeS… beat me to it.
 
If you break your neck in a car accident because the software got it wrong, are you going to just think "damn, but never mind; statistically, a couple of people somewhere haven't been in an accident and I've taken one for Team Tesla"?

The choice is simple: either Tesla pays, or your insurance will have to take on the extra obligation, and of course for them to do so they'll want to be satisfied about the quality and appropriate use.
So far we have only one contender in the L3 stakes… Mercedes, and only under extremely strict conditions. Imagine you are in an accident and someone else is hurt: are Mercedes going to lie down and pay up, or will they walk that section of motorway until they find the one thing that is outside their remit? And what then? Your car insurance will drop you instantly and suddenly you are personally liable. That's fine if you're a Mercedes owner: you can afford it. But what about us poor Tesla owners?
 
So far we have only one contender in the L3 stakes… Mercedes, and only under extremely strict conditions. Imagine you are in an accident and someone else is hurt: are Mercedes going to lie down and pay up, or will they walk that section of motorway until they find the one thing that is outside their remit? And what then? Your car insurance will drop you instantly and suddenly you are personally liable. That's fine if you're a Mercedes owner: you can afford it. But what about us poor Tesla owners?
My insurance (Churchill) already has a whole section of terms around autonomous vehicles; the TL;DR is that they cover anything that is approved for use on UK roads.
 
So far we have only one contender in the L3 stakes… Mercedes, and only under extremely strict conditions. Imagine you are in an accident and someone else is hurt: are Mercedes going to lie down and pay up, or will they walk that section of motorway until they find the one thing that is outside their remit? And what then? Your car insurance will drop you instantly and suddenly you are personally liable. That's fine if you're a Mercedes owner: you can afford it. But what about us poor Tesla owners?
Funny how your assumption is that Mercedes will do anything to get off the hook, yet they're the only ones who have even started on the L3 journey.

People totally misunderstand why Merc have limits on L3. It's an inconvenient truth for the Tesla hardcore that the limits are not down to software capability: the cars can continue to drive along at L2 at higher speeds and without the restrictions. The limits are imposed as a safety measure while all the interested parties, the regulators, the police, the insurers, gain confidence and understanding of a true L3 scenario.

If there is an accident, I imagine everyone will trawl every inch of the road to find out what happened, not to apportion blame as such (although I suspect that might be a by-product) but to learn and address the cause.

And that's the only approach that's going to succeed.
 
People totally misunderstand why Merc have limits on L3. It's an inconvenient truth for the Tesla hardcore that the limits are not down to software capability: the cars can continue to drive along at L2 at higher speeds and without the restrictions. The limits are imposed as a safety measure while all the interested parties, the regulators, the police, the insurers, gain confidence and understanding of a true L3 scenario.
On the other side, staying at Level 2 means that Tesla get what they want: training data.

Every intervention is a training video that can improve the software; you don't get that with Level 3.

To be fair, from a control perspective we really haven't seen much of this improvement yet; however, with end-to-end nets now in play in V12, this is a particular reason for Tesla to hold out at Level 2 for longer.

Hopefully V12 will see the true "march of the 9s" commence, and if V12 is what we hope it is, then improvement could be rapid from here, potentially skipping Level 3 altogether, which in its present guise at Mercedes is, IMHO, not really of much practical help to the user.
 
Why can't they get training data while offering L3? Remember, it's not L3 or nothing: you drop back down to L2 when you're not able to use L3.
With Level 3, the car sees and understands something is not right and asks the driver to take over.

With Level 2, the driver sees something that's not right (which inherently means the car does not see/understand it) and takes over.

You will get much more meaningful and helpful data I think through Level 2.
 
Hopefully V12 will see the true "march of the 9's" commence, and if V12 is what we hope it is, then improvement could be rapid from here, potentially skipping level 3 all together, which in its present guise at Mercedes IMHO is not really of much practical help to the user.

Technically, there is very little difference between L3 and L4.

The biggest difference is that with L3 you need a suitably qualified driver in the driver's seat. With L4 you don't need one in the vehicle at all, though nothing stops you implementing it in a regular vehicle, Cruise being an example.

The ODD for L3 and L4 can be exactly the same. But to implement L4, the AV must be able to operate wholly without supervision within its ODD. That will likely require tighter vehicle approval and potentially more controlled handover, but at the end of the day an L3 and an L4 can be the same system; the L4 has just gone through tighter approval to allow no driver to be present in the vehicle.
 
With Level 3, the car sees and understands something is not right and asks the driver to take over.

Level 3 should not need to ask the driver to intervene if something is 'not right'.

Level 3 should ask the driver to intervene in a planned and time-controlled manner when it detects that the AV is going to leave its ODD, e.g. it is about to leave a motorway, or a traffic jam is clearing so the vehicle will be travelling faster than its ODD allows.

All the while an AV is in its ODD, and during the transitions in and out, the AV is operating normally. There is nothing exceptional going on to learn from.

If something is 'not right' and the driver needs to take over in an unplanned way (for >= L3), it's equally a failure in both L2 and L3.
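The planned-handover behaviour described above can be sketched as a toy state machine. The mode names, `ToyL3Controller`, and the ~10-second window are illustrative assumptions, not any real vehicle's logic; the key property is that leaving the ODD triggers a countdown during which the system is still responsible, rather than an instant, unplanned disengagement:

```python
from enum import Enum, auto

class Mode(Enum):
    L3_ACTIVE = auto()  # system driving; driver attention not required
    HANDOVER = auto()   # planned takeover requested; system still driving
    DRIVER = auto()     # driver has taken control (back to L2/manual)
    FALLBACK = auto()   # driver never took over; minimal-risk manoeuvre

class ToyL3Controller:
    HANDOVER_SECONDS = 10.0  # typical quoted planned-takeover window

    def __init__(self) -> None:
        self.mode = Mode.L3_ACTIVE
        self.countdown = 0.0

    def tick(self, dt: float, leaving_odd: bool, driver_ready: bool) -> Mode:
        if self.mode is Mode.L3_ACTIVE and leaving_odd:
            # Planned exit: warn the driver well before the ODD boundary.
            self.mode = Mode.HANDOVER
            self.countdown = self.HANDOVER_SECONDS
        elif self.mode is Mode.HANDOVER:
            self.countdown -= dt
            if driver_ready:
                self.mode = Mode.DRIVER
            elif self.countdown <= 0:
                # The system never just gives up: if the driver does not
                # respond in time, it must execute a minimal-risk
                # manoeuvre itself (e.g. come to a controlled stop).
                self.mode = Mode.FALLBACK
        return self.mode
```

For example, `tick(1.0, leaving_odd=True, driver_ready=False)` moves the controller into `HANDOVER`, and a later tick with `driver_ready=True` hands control back to the driver; there is no path where the system silently drops out mid-manoeuvre.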
 
On the other side, staying at Level 2 means that Tesla get what they want: training data.

Every intervention is a training video that can improve the software; you don't get that with Level 3.

To be fair, from a control perspective we really haven't seen much of this improvement yet; however, with end-to-end nets now in play in V12, this is a particular reason for Tesla to hold out at Level 2 for longer.

Hopefully V12 will see the true "march of the 9s" commence, and if V12 is what we hope it is, then improvement could be rapid from here, potentially skipping Level 3 altogether, which in its present guise at Mercedes is, IMHO, not really of much practical help to the user.
 
The Merc system can continue to learn; it's not binary L2 or L3. It will operate at L3 within its chosen envelope and at L2 outside it. Take speed: I think it's limited to something like 50 km/h at L3; above that the driver has to be ready to take over, but the car can still drive just like AP does all the time.

It learns, collects stats, and reaches the point where it can say "zero interventions required below 70 km/h"; it could then agree with regulators, insurers, etc. a lift to the speed cap, all evidence-based and, more importantly, incremental. We could see a steady increase in speed, maybe no longer needing to follow another car, etc., and the utility will increase for us, but it's genuine utility.

The Tesla way is different: they're not looking for zero, they're looking to trend towards zero. In a way it's a philosophical choice between expecting 100% error-free operation, but defining the environment so you can achieve that, and getting as close to 100% as you can, but with few restrictions on the environment.
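The incremental, evidence-based lifting of the speed cap described above could be sketched like this. The fleet data and the `proposed_speed_cap` helper are invented for illustration; the point is that the cap only rises to the highest speed band whose observed intervention rate meets the agreed target:

```python
# Toy sketch of evidence-based ODD expansion (hypothetical data):
# compute intervention rates per speed band and propose raising the
# L3 cap only up to the highest band that meets the target rate.

def proposed_speed_cap(band_stats: dict[int, tuple[int, int]],
                       current_cap: int,
                       max_rate: float = 0.0) -> int:
    """band_stats maps a speed-band upper bound (km/h) to
    (interventions, kilometres driven). The cap is raised band by
    band while the intervention rate stays at or below max_rate."""
    cap = current_cap
    for upper in sorted(band_stats):
        if upper <= cap:
            continue  # already inside the approved envelope
        interventions, km = band_stats[upper]
        if km > 0 and interventions / km <= max_rate:
            cap = upper
        else:
            break  # stop at the first band that fails the evidence bar
    return cap

# Hypothetical fleet stats: zero interventions observed up to 70 km/h,
# but some in the 70-90 km/h band, so the cap lifts from 50 to 70 only.
stats = {50: (0, 120_000), 70: (0, 80_000), 90: (3, 60_000)}
print(proposed_speed_cap(stats, current_cap=50))  # 70
```

The design choice mirrors the argument in the post: expansion is monotonic and stops at the first band lacking evidence, rather than extrapolating past it.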
 