Welcome to Tesla Motors Club

Smart summon liability?

Fair enough.... The issue for me isn't whether it goes to a waypoint or homes in on the phone. The issue is that the feature requires maintenance of a clear line of sight (and presumes that the operator can visualize what the car is doing from 150 ft away in order to intervene before a crash).

Yes, this won't be useful. I wonder if they will stream the cameras to you. It will be moving so slowly that lag won't be too big a problem.
 
Fair enough.... The issue for me isn't whether it goes to a waypoint or homes in on the phone. The issue is that the feature requires maintenance of a clear line of sight (and presumes that the operator can visualize what the car is doing from 150 ft away in order to intervene before a crash).

It doesn't require it, which is why I think it's a bad idea to release this. I picture some idiot standing in line paying for something and trying to have it pick them up in front of the store. Most people won't do this, but it only takes a few idiots, and they exist, as shown by the various YouTube videos of people using Autopilot incorrectly.
 
Funny story: after reading the explanation of how the better Summon will work, I too was thinking "Meh, that's still gonna be pretty useless..."

Then I left work. It was raining. Hard. The exit I used has a covered area you can drive up to. Car was probably 100-150 feet away parked.

Being a work lot, not a mall, there's no significant pedestrian traffic.

It was the perfect use case for the new summon.

Not being an early update tester I got wet walking instead.
 
It doesn't require it, which is why I think it's a bad idea to release this. I picture some idiot standing in line paying for something and trying to have it pick them up in front of the store. Most people won't do this, but it only takes a few idiots, and they exist, as shown by the various YouTube videos of people using Autopilot incorrectly.

It "requires" it, in the sense that the manual apparently says you are misusing the feature if you do not maintain visual contact and control. It doesn't enforce that requirement.

Just like AP requires that the driver maintain attention and control at all time but is incapable of preventing folks from using it in some other manner (as long as the driver touches the wheel in a certain way).

This is just another way for Tesla to act like every accident caused by its tech is actually due to misuse by the driver who failed to stop the tech from doing something wrong.
 
Funny story: after reading the explanation of how the better Summon will work, I too was thinking "Meh, that's still gonna be pretty useless..."

Then I left work. It was raining. Hard. The exit I used has a covered area you can drive up to. Car was probably 100-150 feet away parked.

Being a work lot, not a mall, there's no significant pedestrian traffic.

It was the perfect use case for the new summon.

Not being an early update tester I got wet walking instead.

You would think so, but I wouldn't get my hopes up if I were you. ;-)

It will have its use case, and yours might be the best one, but for 95% of other people it will be a parlor trick to show friends, which is a shame.
 
Sorry if this has been asked already, but who is responsible if Smart Summon fails and runs into a wall, a car, or a person? Given that it's supposed to come to you from anywhere in the parking lot, I can't imagine the driver being held responsible for it. Just wondering if this has been addressed?
You and your insurance company. You, the human, remain responsible for the safe operation of the vehicle, and for stopping it as appropriate.
 
It's been demonstrated that the HW2/2.5 camera setup has blind spots that would make self-navigating parking lots very dangerous.

@AnxietyRanger you had a picture of these blind spots at the front corners, near the bumper. @Bladerskb you discussed a video that showed what the camera view looks like when backing out from between two parked cars.
It isn't any different from your eyes. You can see something three feet away better than your shoe tip, unless you look straight down, and then you can see your shoe tip better than the tip of your nose.
 
The data Tesla has released says nothing about how often they “run over kids” in parking lots, which is, I believe, what you were discussing. So we can’t compare a Tesla on Summon in a parking lot to other cars driven by humans in parking lots (and I’m not aware of “kid” fatality data in parking lots).
No, it discusses how often the automated systems fail at speeds 5-10 times higher than parking lot speeds, where the sensors are less useful and accurate than they would be in parking lots.
And Tesla’s data that they released defines an “accident” as an incident where the airbags go off. Which would almost never happen if a Tesla was in a parking lot and hit something, human or otherwise, because the impact speed would be too low.

So Tesla’s released data simply does not apply to the scenario you raise.
 
And Tesla’s data that they released defines an “accident” as an incident where the airbags go off. Which would almost never happen if a Tesla was in a parking lot and hit something, human or otherwise, because the impact speed would be too low.

So Tesla’s released data simply does not apply to the scenario you raise.


...it applies to a massive # of miles at much higher speed, where risk of accident and injury is tremendously higher.

So situations at vastly lower speed, with much lower risk of injury or accident, should look even better for Tesla.
 
I sincerely hope this would work.

But... parking lots are some of the most difficult and unpredictable environments. Yes, it is hard to kill someone at those speeds, but the chaotic and unregulated nature of parking lots makes them some of the most accident-prone areas. I would expect hands-free highway driving to be well perfected before parking lot Summon is feasible outside of the most trivial use cases.
 
...it applies to a massive # of miles at much higher speed, where risk of accident and injury is tremendously higher.

So situations at vastly lower speed, with much lower risk of injury or accident, should look even better for Tesla.
Yep, you can run over hundreds of kids at parking lot speeds without a single injury to the driver. Parking lots are the safest driving environment and the perfect place for testing FSD.
I sincerely hope this would work.

But... parking lots are some of the most difficult and unpredictable environments. Yes, it is hard to kill someone at those speeds, but the chaotic and unregulated nature of parking lots makes them some of the most accident-prone areas. I would expect hands-free highway driving to be well perfected before parking lot Summon is feasible outside of the most trivial use cases.
I wish they would perfect self parking before they work on gimmicks like this.
 
Why imagine when you can read from the ordering page:

"The currently enabled features require active driver supervision and do not make the vehicle autonomous. The activation and use of these features are dependent on achieving reliability far in excess of human drivers as demonstrated by billions of miles of experience, as well as regulatory approval, which may take longer in some jurisdictions. As these self-driving features evolve, your car will be continuously upgraded through over-the-air software updates."

In summary: Driver is responsible.
No, you don’t know until a case is tested before a judge.
 
It works as advertised, the same way Autopilot "works as advertised" and yet there have been Autopilot crashes. Really!

There have been plenty of Summon crashes already. Really.

It's the same way that people have relied on the backup safety system of Tesla Automatic Emergency Braking and then wondered why the AEB didn't brake to avoid a crash. Really.

Users need to read the fine print to find the catch.

Tesla says its Full Self-Driving features do not "make the vehicle autonomous", and that is the catch. Really.
Unreal
 
No, you don’t know until a case is tested before a judge.

You are right that different people have different understandings.

But for a legal case, it starts with what's in the contract.

Right now, it says the driver is responsible.

Then you can go to court and argue that the contract is wrong because it should say Tesla is responsible.

Then the jury will decide which interpretation of the contract is reasonable.
 
You are right that different people have different understandings.

But for a legal case, it starts with what's in the contract.

Right now, it says the driver is responsible.

Then you can go to court and argue that the contract is wrong because it should say Tesla is responsible.

Then the jury will decide which interpretation of the contract is reasonable.
Let's say "smart" summon runs over a kid in a parking lot. If there is a lawsuit, it will be against both the "driver" and Tesla. The kid didn't sign a contract with Tesla. I could see a jury siding with the kid, but who knows.