Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Autopilot disengagement, driving etc. out of main.

Let’s say the average car gets in a parking lot collision every 30 years.

I'm pretty sure that's way too low: the number of car insurance claims is around 170m per year in the U.S., and there are three strong filters that significantly reduce filings for minor fender-benders:
  • deductibles, and the risk of higher rates with an at-fault claim on record, lead people to settle in cash instead of filing a claim,
  • drivers without own-damage coverage may likewise pay in cash rather than file,
  • there's also a substantial percentage of hit-and-runs, both intentional and accidental.
I wouldn't be surprised if the real figure was twice as high, with a lot of the cash-settled fender-benders happening in parking lots.

With ~250m registered vehicles in the U.S., this suggests that the true rate of minor fender-benders is closer to once per vehicle per year.
 
  • Like
Reactions: jerry33
I didn't try it. I don't feel the need to rush to add to the number of potential accidents. It's not a big deal to walk a few steps to the car.

I think we need to take it slowly and give Tesla a chance to address any issues before piling up hundreds of accidents. Regarding that collision we saw: it would be nice if Smart Summon honked preemptively.

Netflix/YouTube is really cool though. Somehow I think it will make no less of an impression on observers.
I kinda liked the idea of having the flashers on when doing smart summon. I think it would alert all around that something is going on that is different than the current normal.
 
On a different topic, seeing smart summon collisions within days of v10 is worrisome. Let’s say the average car gets in a parking lot collision every 30 years. That’s 1 collision daily for every 10,000 cars. After v10 is fully deployed, there will be 100 or 200k cars with SS.

That implies one or two accidents daily might be ok, because owners do not use SS that often. If SS accidents are much higher, SS is causing accidents and Tesla will need to do something.
Every Tesla accident will be reported regardless of how minor, but not every parking lot accident is reported.
 
It may be an unpopular opinion on these forums, but the way 1Q was handled is not something that smart money will easily forget. Those who got burned, and everyone who saw them get burned, will think long and hard about whether to trust Tesla again. The volatility is not helping either.

This isn’t the case for retail investors, who are beholden only to their own wallets. The TMC thesis of a disparity between perception and reality is the opportunity here.

————————

On a different topic, seeing smart summon collisions within days of v10 is worrisome. Let’s say the average car gets in a parking lot collision every 30 years. That’s 1 collision daily for every 10,000 cars. After v10 is fully deployed, there will be 100 or 200k cars with SS.

That implies one or two accidents daily might be ok, because owners do not use SS that often. If SS accidents are much higher, SS is causing accidents and Tesla will need to do something.
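The back-of-the-envelope math above can be sketched like this (the 30-year interval, the fleet sizes, and a ~10% SS usage fraction are all this post's assumptions, not measured figures; the thread rounds the per-day denominator to 10,000 cars):

```python
# Convert a per-car collision interval into an expected fleet-wide daily rate.
def expected_daily_accidents(fleet_size, interval_years, usage_fraction=1.0):
    """Expected accidents per day if each car has one parking-lot collision
    every `interval_years` years, scaled by how often SS is actually in use."""
    per_car_daily = 1 / (interval_years * 365)
    return fleet_size * per_car_daily * usage_fraction

# One collision per car every 30 years ~= 1 daily accident per ~11,000 cars:
print(1 / expected_daily_accidents(1, 30))          # ~10,950

# 100-200k SS-capable cars, with SS doing ~10% of the parking-lot driving:
print(expected_daily_accidents(100_000, 30, 0.10))  # ~0.9 accidents/day
print(expected_daily_accidents(200_000, 30, 0.10))  # ~1.8 accidents/day
```

With the 10% usage assumption, the "one or two accidents daily might be ok" baseline falls out directly.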

Once every 30 years sounds far too low. In ~16 years of driving, other cars have hit my car while it was parked in parking lots 5 times (I use this stat since it eliminates any question about my own driving).
 
Yes, I do have that enabled. Let me clarify my statement: I don't mean it never uses the HOV lane, but that it rarely uses it and prefers to leave when I would have stayed in the HOV.

To further elaborate on my problems (including the use of the HOV lane), I have documented NoA's mistakes on my trip this morning using Tesla dashcam clips:

1. The first clip shows me traveling in the left-most main lane of a busy 3-lane highway when the HOV lane becomes available on the left. NoA makes an attempt to switch (about 30s into the clip), but it is too slow and does not accelerate; a car comes up (the HOV lane is moving significantly faster), so it cancels the attempt. I see it give up on the lane change, so I take over (at about 37s) and make the change manually at the next opportunity.
Front camera clip.
Left camera of same.

2. A couple of minutes later a highway interchange is coming up, for which I need to leave the HOV lane. NoA indicates that it wants to switch, but shows a red line as another car travels parallel to me in the next lane. There is space both in front of and behind that car, so NoA would only need to accelerate or decelerate a bit to make the switch, but it fails and keeps driving side-by-side. When the dashed section between the HOV and main lanes is about to end, I take over and quickly make the switch manually. NoA would have stayed in the HOV lane and missed the upcoming highway interchange.
Front camera footage.

3. The next mistake is still visible in the same clip above (at around 20 seconds in). I have turned NoA back on and am now traveling in the leftmost of the 3 lanes, but for the highway interchange I need to be at least in the middle lane (which splits). I watch NoA and see no indication that it plans to make the switch, so I take over and switch manually. This mistake would have sent me further down the wrong highway, causing a significant detour to get back to my destination.

4. The fourth NoA mistake is still in the above clip (at around 39 seconds in). After the highway split, a new HOV lane starts on the left, but NoA gives no indication that it will switch into it; it wants to stay in the busy main lane while the HOV lane is sparse, so I take over and manually change to the HOV.

5. The next time there is a dashed line between the HOV and the main lanes, NoA indicates that it wants to leave the empty HOV lane for the busy main lanes, as shown in the clip here. I manually cancelled this lane change to stay in the HOV.

6. At the next opportunity, NoA again wants to leave the HOV lane. This time the main lanes are not so busy (but the HOV is completely empty), as shown in this clip, and this time I allowed NoA to make the change (at about 50s in the video) even though I could have traveled another 10 km in the HOV before needing to leave for my destination.

7. Further down, the main lanes got busier again while the HOV stayed clear (which is why I would have preferred to stay in the HOV; I know this traffic pattern from experience). The next clip shows NoA deciding to switch from the empty left lane into the busier and slower middle lane at around 38 seconds into the clip. This is still about 6 km before my exit, so it is unnecessarily early to move right.

So that is 7 bad decisions by NoA within roughly half an hour. None of them was a safety concern, but two of them (#2 and #3) would have caused me to miss my interchange and stay on the wrong highway.

ps: My sw version number is 2019.8.5. 3aaa23d

I got V10 installed yesterday, so I tested NoA again on the same route (part of my morning routine) to see if it handles it any better. Unfortunately, it repeated mistakes 1, 2 and 4 listed above, at which point I turned it off and stayed with basic AP.

This time, mistake #1 was even more dangerous: the busy main lanes were moving very slowly (about 30 km/h), while the HOV lane was sparse but the cars traveling in it were going much faster (about 100 km/h). NoA attempted to switch into the HOV lane while a car already in it was approaching fast; my car was already halfway into the HOV lane, still moving ~30 km/h, and the car behind was coming up fast and close. I did not want to risk getting rear-ended, so I took over, moved quickly into the HOV lane and accelerated hard to stay ahead of the other car. I know AP would not make the kind of sudden move and acceleration I did, but it might have aborted the lane change and moved back into the original lane -- which I have seen it do several times on other occasions -- but I did not feel safe waiting any longer. In any case, that kind of hesitant driving (moving halfway into another lane, then aborting the lane change) is a very bad habit that confuses other drivers and is therefore potentially dangerous.
 
  • Informative
Reactions: Jeff Hudson
Maybe, but we really need data. Parking lot accidents are 1/5 of all accidents and have high pedestrian injury rates. People are really bad about paying attention in parking lots. I can't find a solid number but it's quoted as "tens of thousands annually".

Why hundreds are killed in crashes in parking lots and garages every year


Self-reported poll, so the real numbers are probably worse.
Parking Lot Safety
In an NSC public opinion poll, 66% of drivers nationwide said they would make phone calls while driving through parking lots. Respondents also said they would:

  • Program GPS systems (63%)
  • Text (56%)
  • Use social media (52%)
  • Send or receive emails (50%)
  • Take photos or watch videos (49%)
NSC found teens (59%) were more likely to engage in personal grooming than adults (53%) while driving in parking lots, but less likely to be on the phone (60% vs. 66%).

During the hectic holiday season, drivers and pedestrians also are likely to be distracted by extensive to-do lists and are hurriedly trying to get from one place to another.

Summon will wind up being like Autopilot: objectively safer, but people with an agenda or a lack of brain cells will draw undue attention to it. We also need to see whether it's other dumb people hitting the Teslas.
Edit: 50k crashes a year, with 500 fatalities and 60k injuries.

50k parking lot crashes annually with 200M drivers works out to one per driver every 4,000 years, which is two orders of magnitude lower than my estimate. It can't be that low though...
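For reference, a minimal sketch of the arithmetic behind the 4,000-year figure (it takes the 200M drivers and 50k reported crashes at face value, which is exactly what's in question here):

```python
import math

# Per-driver interval implied by the national parking-lot crash count.
drivers = 200_000_000
parking_lot_crashes_per_year = 50_000

years_between_crashes = drivers / parking_lot_crashes_per_year
print(years_between_crashes)  # 4000.0 years per driver

# Gap vs. the earlier once-per-30-years guess, in orders of magnitude:
print(math.log10(years_between_crashes / 30))  # ~2.1
```

The ~2.1 confirms "two orders of magnitude" -- which is why the 50k figure looks like a severe undercount of minor, unreported scrapes.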

Small accidents happen about once a year - or once every 12k miles. A large % of them happen in parking lots.

If SS is more cautious than the average human, it will greatly reduce at-fault accidents for owners.

BTW, in general newer cars should have fewer accidents. Backup cameras and other sensors will help greatly. The accident we saw on Twitter would not have happened if the other car had had a backup camera or other sensors.

Nocturnal estimated 1 in 5. So one parking lot accident every 5 years is an order of magnitude higher than my estimate, implying the accident rate for smart summon is not alarming.

I'm pretty sure that's way too low: the number of car insurance claims is around 170m per year in the U.S., and there are three strong filters that significantly reduce filings for minor fender-benders:
  • deductibles, and the risk of higher rates with an at-fault claim on record, lead people to settle in cash instead of filing a claim,
  • drivers without own-damage coverage may likewise pay in cash rather than file,
  • there's also a substantial percentage of hit-and-runs, both intentional and accidental.
I wouldn't be surprised if the real figure was twice as high, with a lot of the cash-settled fender-benders happening in parking lots.

With ~250m registered vehicles in the U.S., this suggests that the true rate of minor fender-benders is closer to once per vehicle per year.

If parking lot accidents are 1/5 of those, that implies one every 5 years as well, so also an order of magnitude higher than my estimate. I used 30 years as a SWAG because I haven't gotten into a parking lot collision myself, but I should have looked it up. That could mean a dozen+ accidents a day even if SS is relatively safe, assuming SS is used 10% of the time.

The point being that Tesla will know whether it needs to do something if SS causes more accidents than humans would.

Also, I've heard that SS is set to 5 mph. IMO that's much too fast for autonomy at this stage. A max speed of 2 or 3 mph would give twice the time to react to situations, making SS considerably safer to supervise. Tesla should add a setting for max SS speed that defaults to something slower, analogous to the following-distance setting on AP.
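Putting the revised numbers together, here is a quick sketch of where "a dozen+ accidents a day" comes from (the once-per-5-years rate, the 200k fleet, and the 10% usage fraction are all this post's assumptions):

```python
# Revised estimate: ~1 minor accident per car per year, 1/5 of them in
# parking lots -> one parking-lot collision per car every ~5 years.
fleet_size = 200_000        # SS-capable cars once v10 is fully deployed
interval_years = 5          # one parking-lot collision per car every 5 years
ss_usage_fraction = 0.10    # share of parking-lot driving done by SS

expected_per_day = fleet_size * ss_usage_fraction / (interval_years * 365)
print(expected_per_day)  # ~11 SS accidents/day even if SS is exactly
                         # as accident-prone as a human driver
```

So if Tesla sees substantially more than that, SS is likely causing accidents rather than merely inheriting the human baseline.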
 
And now for something completely different...

Everyone lately has seemed focused on the obvious news this week -- upcoming Q3 delivery numbers and headliner v10 features like Smart Summon, Netflix, and Spotify. But v10 also brought several seemingly minor under-the-hood Autopilot improvements that I believe are being largely overlooked but have a significant impact on the prowess and enjoyment of the system.

For context, I received v10 on my 3 Friday night (along with a bugfix update Saturday morning). I then took part in an autumn mountain drive with our local club on Sunday (side note: close to 60 Teslas participated--it really drove home how the base of vehicles has grown lately). This drive was around 200 miles, through everything from city streets to interstate highways to curvy mountain roads. Several things jumped out to me:
  • Hand detection on the wheel is much improved in my car, given my hand placement. The car nags more quickly if hands are not on the wheel, but when my hand is on the wheel, the car now detects it virtually 100% of the time, even when that hand is exerting no artificial torque on the wheel. I just have my hand resting there like I prefer to do, either on the side or the bottom. This is a very marked change, and much for the better IMO. It will make it harder for people to be dumb and have hands off for any significant period, while simultaneously making it much less frustrating to keep the car happy through normal means.
  • Phantom braking seems greatly reduced on this release. I've only driven maybe 100 miles on AP since the update so there's a chance that I'm overstating this, but I experienced only one significant phantom brake, and it was entering a construction zone.
  • The lane change visualization improvement is nicer than I had expected. The old system wasn't particularly calling out for improvement, but it's nice to now see exactly how your car fits into the other lane's traffic when the change begins.
  • Additional vehicle types (eg pickup truck) are depicted in the visualization. Specific lane line types are shown (eg double-yellow, normal solid line, dashed passing line). The 'dancing cars' are now virtually gone at stops. And the ability to manipulate the visualization is neat. These are each very small improvements but together they raise the confidence level in what the car is seeing.
In short, v10 is a significant release on the AP front, even discounting the inclusion of Smart Summon (which is cool but largely a party trick at present--needs a lot of work before it's going to be doing anything resembling FSD in a way that doesn't have the owner on edge / cringing at failed parking lot etiquette like pulling into spaces sideways, using bidirectional space however it desires, etc). The overall experience of driving on AP is significantly improved through the implementation of these small changes. As an owner, I'm happy. As an investor, I'm pleased.

While I agree that the points you listed are nice improvements, I have tested NoA on my usual morning route and it still makes the same mistakes as before, one of which was particularly dangerous and could have caused an accident if I had not intervened. More details in the following post (as it's OT here):
Autopilot disengagement, driving etc. out of main.
 
Also, I've heard that SS is set to 5 mph. IMO that's much too fast for autonomy at this stage. A max speed of 2 or 3 mph would give twice the time to react to situations, making SS considerably safer to supervise. Tesla should add a setting for max SS speed that defaults to something slower, analogous to the following-distance setting on AP.

2-3 mph is the walking speed of an average person. I would be pissed if a car in front of me was going that slowly when there’s clearly an open lane to drive in. People drive faster than that every day in parking lots.
 
50k parking lot crashes annually with 200M drivers is one every 4000 years, which is two orders of magnitude lower than my estimate. It can't be that low though...
Yeah, I had that thought as well. Perhaps it's concentrated in a relatively small number of drivers. There are 6 million accidents a year, so if only 50k are in parking lots, that's a really low number.
 
2-3 mph is the walking speed of an average person. I would be pissed if a car in front of me was going that slowly when there’s clearly an open lane to drive in. People drive faster than that every day in parking lots.

At this stage I don't think people should be using SS in parking lots if there are cars all over. The summoner can also stop the car and let trailing cars drive around. Put this another way:

If I want to drive a car on roads shared with others, I need to pass a test to get my driver's license.

If I want to fly a drone around people, I need to pass an exam to certify myself as a remote pilot.

If I'm supposed to remotely supervise a car which is navigating through a parking lot with pedestrians, pets, and vehicles... I don't need any training at all?

Tesla says we must maintain line of sight and supervise SS, which means it's not completely reliable or safe. That implies the person supervising should have some training or the feature should be as timid as possible until the supervisor gains enough expertise.

If the supervisor doesn't know how to change the default from 3 mph to 5 mph, or how to stop SS to let other cars pass, they shouldn't be using SS at 5 mph either.

Edit: My SS concerns are very much investment-related. If you feel SS is already as good as or better than a human driver, ignore my concerns. If not, and SS causes a fatality or severe injury, I don't even know what would happen to Tesla, Elon, or the SP.

Saying that a human must supervise is no defense when the "supervisor" has not been trained or certified for remotely operating or supervising vehicles. Btw, untrained supervisors (e.g. mothers with children, guys showing off the car to friends) get distracted too.
 
Last edited:
Does anyone have an idea whether Smart Summon utilizes HW3 properly? I'm wondering if performance is better for those folks than for the 2.5 guys like myself.

Up till now, all crashes in parking lots have been humans' fault. Humans are neither safe nor reliable.

At least a human using Summon has a computer dedicated to avoiding crashes.
And considering the 60k injuries and 500 fatalities a year, even if the crash rate increases we might see lower odds of injury.
 
  • Love
Reactions: StealthP3D
At this stage I don't think people should be using SS in parking lots if there are cars all over. The summoner can also stop the car and let trailing cars drive around. Put this another way:

If I want to drive a car on roads shared with others, I need to pass a test to get my driver's license.

If I want to fly a drone around people, I need to pass an exam to certify myself as a remote pilot.

If I'm supposed to remotely supervise a car which is navigating through a parking lot with pedestrians, pets, and vehicles... I don't need any training at all?

Tesla says we must maintain line of sight and supervise SS, which means it's not completely reliable or safe. That implies the person supervising should have some training or the feature should be as timid as possible until the supervisor gains enough expertise.

If the supervisor doesn't know how to change the default from 3 mph to 5 mph, or how to stop SS to let other cars pass, they shouldn't be using SS at 5 mph either.
Up till now, all crashes in parking lots have been humans' fault. Humans are neither safe nor reliable.

At least a human using Summon has a computer dedicated to avoiding crashes.
 
Edit: My reason for my SS concerns is very much investment related. If you feel SS is already as good as or better than a human driver, ignore my concerns. If not and SS causes a fatality or severe injury, I don't even know what would happen.
SS is not more of a concern than AP - esp given the very low speeds.

Saying that a human is supposed to supervise is no defense when the supervisor has not been trained or certified for remotely operating or supervising vehicles.
Think about all the training people get to supervise kids ;)
 
  • Informative
Reactions: Artful Dodger
I kinda liked the idea of having the flashers on when doing smart summon. I think it would alert all around that something is going on that is different than the current normal.
They should also add the "horn" button to the smart summon screen. Since we are supposed to maintain line of sight the whole time, having your finger on the trigger to warn "blind spot Bob" he's about to eat his deductible could be worthwhile.
 
They should also add the "horn" button to the smart summon screen. Since we are supposed to maintain line of sight the whole time, having your finger on the trigger to warn "blind spot Bob" he's about to eat his deductible could be worthwhile.

Yeah, actually that feature would be very useful. Especially if the car detected objects around it, "sensed" that a collision was imminent, and then gave you an alert so that you could hit the horn button.
 
SS is not more of a concern than AP - esp given the very low speeds.


Think about all the training people get to supervise kids ;)

You monitor AP in first person. You monitor SS in third person. Totally different, which is why people crash drones so often.

I have been run into plenty of times by little kids "supervised" by parents in the store. Fortunately, they didn't break my leg or kill me. Can't say the same about a 2 ton car running into me at 5 mph.
 
Up till now, all crashes in parking lots have been humans' fault. Humans are neither safe nor reliable.

At least a human using summon has a computer dedicated to avoid crashing.

If you believe that SS drives as well as a human, this doesn't apply. However, I saw an article with links to various crashes, in one of which SS drove into a garage. Another was a narrow miss that would have been Tesla's fault had the summoner not stopped the car in time (barely).

Regardless, I can't recall any case where operating a vehicle that could cause grave injury in public spaces requires no training. E.g., do people operate forklifts in stores full of people without someone training them first?

A driver's license is first person driving, which is not the same as remote operation of a vehicle. Even in a plane, visual flight rules are very different from instrument flight rules, and neither is quite the same as remote line of sight control.
 
I drove ~180 miles on AP today with the latest firmware (2019.32.11). It should have been with lane-change confirmation turned off, but even though I enabled that setting, it wasn't active when I drove.

I-44 west of St. Louis has a long stretch where -- after several months now -- they still haven't bothered to paint lines for westbound traffic (though they finally have for eastbound).

Old history: AP would behave normally, including offering a lane change when relevant, but abort and exit AP as soon as you said to do so. AP could not be enabled again due to lack of lane lines.

Recent history: AP would behave normally and could successfully change lanes when directed to. If for any reason AP was aborted it could not be enabled again due to lack of lane lines.

Today: AP would behave normally, including offering a lane change when relevant, but abort and exit AP as soon as you said to do so. Lane keeping was tolerable, but not as good as it used to be. At times, the red hands-on-the-wheel prompt would appear several times a second, which was rather annoying. However, nearly all of the time AP could be re-enabled when in the right-hand lane (though not in the left lane).

Verdict: this is a regression in capabilities (it cannot change lanes, and lane keeping is worse). It is also more cautious (red nags). This suggests that the underlying code has changed significantly rather than iteratively. I strongly suspect that it is geared for HW3. The nags could be a reflection of lower confidence on the slower hardware, but they could also reflect a greater (more refined) awareness, so it knows to nag the user because there are no lane lines.

Although there were multiple lane encroachments from semis, only once did my car react and move away. Still, even once is more than it would do in previous versions, and that is a much-welcomed change.

Taking exit ramps was better. One was around 20 mph and the Tesla handled it. The approach to the exit was slower than warranted, to the point that it invited other traffic to cut in front (which can be dangerous), but in all it was a definite improvement and the first time I've not had to take over in a cloverleaf.

On an exit that split into three lanes, where I needed the leftmost, I was tugging the wheel toward the left, which might be why it worked, but I was pleased that AP didn't abort. What I would really like is for it to recognize turn signals and pick lanes accordingly -- sometime in the future, I suppose.

Multiple times AP braked out of caution in situations where I don't believe it used to, and where I would not have. Nor, I suspect, would any but the most cautious of drivers. Still, the braking was slight and caused no issue; the speed drop was slight and immediately regained. It is always possible that the braking was done deliberately to attract driver attention, but that is likely reading too much into it.

I've read someone else describe the following experience, but this was a first for me. I needed to merge right and trailing traffic was a bit close. The car started the lane change, swung back, and then completed the lane change. This was different from the usual aborted lane changes -- for starters, because it completed it. I immediately thought of the previously proffered explanation: namely, that it made the first approach to encourage the trailing car to back off.

I'm not sure whether the behavior is actually an expected part of the programming, but it was definitely distinct from aborted lane changes -- like one I experienced on a previous trip where the Tesla had started the lane change and a car behind me ignored my right of way and forced into the lane anyway. In that case my Tesla did the right thing: it immediately aborted the lane change and returned to its lane of travel.

Although I was unable to drive with confirmation turned off, I did try to always confirm when prompted. I declined only a few times; the majority was how the Tesla wanted to drive. One example where I did not confirm was a faster car approaching from behind: they were distant enough that I know most drivers would have pulled out to pass regardless (I know, because I see this constantly on the interstate), but for me it is part of driving etiquette. Another example was changing to the left lane for traffic stopped on the right shoulder.

Tesla should really do this. Sure, the car has good reflexes, but earlier this year I passed an accident where a pickup did not do so and a semi pulled in front of him. I didn't see it as it happened, so I have no idea about the particulars of the circumstance, but no matter how safe your car is, you don't want to slam into what amounts to a stationary semi while going interstate speed. It is also the law in some states.

All in all, v10 didn't seem that different to me from prior versions. I already highlighted the notable differences, which represented a small fraction of the drive; the rest was just business as usual.
 
  • Informative
Reactions: ZsoZso