
Autopilot disengagement, driving etc. out of main.

I'm curious, for those of you on the latest no-confirm navigate on autopilot: what's your miles-per-intervention rate on it? I'm wondering what our baseline is, before going from HW2 to HW3.

Anecdotally I've seen reports of ~50-mile commute legs on highways going 100% intervention free, and most of the interventions I've seen reported are actually voluntary, for cases where NoA is too timid in higher density traffic.

In principle HW3 might already be coast-to-coast capable... (!)
 
Anecdotally I've seen reports of ~50-mile commute legs on highways going 100% intervention free, and most of the interventions I've seen reported are actually voluntary, for cases where NoA is too timid in higher density traffic.

In principle HW3 might already be coast-to-coast capable... (!)

"Optional" interventions of course don't count ;) But ~50 miles per intervention wouldn't actually be that impressive. Better than AP used to be, sure, but...
 
"Optional" interventions of course don't count ;) But ~50 miles per intervention wouldn't actually be that impressive.

It only stopped because the commute leg is ~50 miles, on-ramp to off-ramp. Since it was 100% intervention-free, over 30 days that accumulates to 1,500 miles of intervention-free NoA track record on that route. :D
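
For anyone who wants to keep score the same way, here's a minimal sketch of the arithmetic in Python; the drive-log entries are made up for illustration, not anyone's real data:

```python
# Minimal sketch: miles per intervention from a hand-kept drive log,
# plus the simple accumulation math from above. Numbers are placeholders.

drives = [
    {"miles": 50, "interventions": 0},  # a 50-mile commute leg, no takeovers
    {"miles": 50, "interventions": 1},
    {"miles": 22, "interventions": 2},
]

total_miles = sum(d["miles"] for d in drives)
total_interventions = sum(d["interventions"] for d in drives)

if total_interventions == 0:
    print(f"{total_miles} intervention-free miles so far")
else:
    print(f"~{total_miles / total_interventions:.0f} miles per intervention")

# The 30-day figure is just accumulation on one route:
print(f"50-mile leg x 30 days = {50 * 30} intervention-free miles")
```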
 
I'm curious, for those of you on the latest no-confirm navigate on autopilot: what's your miles-per-intervention rate on it? I'm wondering what our baseline is, before going from HW2 to HW3.
Haven't really kept track, but if I had to guess it would probably be somewhere around 20-25. That might sound bad, but almost all of those are to correct overly conservative actions on the part of the system: tentative lane changes, slowing down to initiate passes, etc. I have only had one instance where, if I had not taken over, it would not have stayed on my route. It was a very strange circumstance where a transition lane from one highway to another is marked off with poles a long way before the actual transition. The car was caught in an outside lane and couldn't move over in time.

Interestingly, I have never had any interventions due to safety concerns. The system is pretty amazing and in the short time I have had it, I have noticed improvements to its functionality and confidence.

Dan
 
Haven't really kept track, but if I had to guess it would probably be somewhere around 20-25. That might sound bad, but almost all of those are to correct overly conservative actions on the part of the system: tentative lane changes, slowing down to initiate passes, etc. I have only had one instance where, if I had not taken over, it would not have stayed on my route. It was a very strange circumstance where a transition lane from one highway to another is marked off with poles a long way before the actual transition. The car was caught in an outside lane and couldn't move over in time.

Interestingly, I have never had any interventions due to safety concerns. The system is pretty amazing and in the short time I have had it, I have noticed improvements to its functionality and confidence.

Dan

Roughly how many miles have you done, do you think?
 
Roughly how many miles have you done, do you think?
I try to use it every day if I can. I even make up destinations just to get miles on it! If I had to guess, I would say around 500 miles. I will be taking a 900-mile road trip in June and a 1,000-mile trip in July. That should be more telling.

Dan
 
I'm trying to read through the Waymo disengagement reports to determine what they classify as a "disengagement" (e.g. whether only safety-critical things count, or whether the driver overriding the car for non-safety-related reasons also counts). But setting aside the CA DMV site being down (I had to use a cached version), the data is beyond useless. The vast majority of reports simply state, "Disengage for unwanted maneuver of the vehicle that was undesirable under the circumstances." The next most common are "Disengage for a perception discrepancy for which a component of the vehicle's perception system failed to detect an object correctly."

I know that, unless absolutely necessary, Waymo just lets its cars "drive poorly": they're allowed to take ages to merge, they're allowed to miss turns, etc. Drivers generally only intervene if the car is going to get in an accident, or if it's basically gotten itself into a situation that it doesn't know how to get out of.

Edit: It might be clearer if I could see a non-text-only version of the report; this text-only version makes it hard to tell whether the vehicle or the operator initiated the disengagements.
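
For anyone who wants to poke at the raw report themselves, this is roughly the kind of tally I was doing. Note the filename and the "Cause" column name are assumptions on my part; the actual spreadsheet layout varies from year to year, so adjust the header to whatever the file really uses:

```python
# Rough sketch for tallying the free-text disengagement causes in a CA DMV
# report export. Filename and column name are placeholders, not the real
# headers; swap them for whatever the downloaded file actually contains.
from collections import Counter
import csv

counts = Counter()
with open("waymo_disengagements.csv", newline="") as f:
    for row in csv.DictReader(f):
        cause = (row.get("Cause") or "").strip()
        if cause:
            counts[cause] += 1

# Print the ten most common cause descriptions with their counts.
for cause, n in counts.most_common(10):
    print(f"{n:5d}  {cause}")
```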
 
Anecdotally I've seen reports of ~50-mile commute legs on highways going 100% intervention free, and most of the interventions I've seen reported are actually voluntary, for cases where NoA is too timid in higher density traffic.

In principle HW3 might already be coast-to-coast capable... (!)
Yes, I'm mostly driving on low/moderate-congestion highways with fairly simple interchanges. Recently, I have not had to cancel a lane change initiated by the car for at least the last 7 to 8 highway drives. I typically take over if I'm impatient and want to get something done rapidly, often in a stealthier fashion than putting on the blinker and doing a standard lane change. I typically only drive about 20 minutes on divided highways a few times per week. I am intervening out of impatience maybe once every 3rd drive. I have not tried Mad Max yet; my understanding is that it does not speed up the lane changes, it just makes them more frequent. My impression is that as long as people are willing to chill out and be more patient, the current NoA works quite well for moderate-congestion highways. My trust factor is high.

My wife frequently drives her X down to Milwaukee. Her feedback is that it is too timid there and even seems to get confused in high congestion. She takes over quite a bit there, but not on the majority of the drive that occurs outside of Milwaukee. She uses Mad Max. Bottom line is she loves NoA for the majority of the drive that occurs in light or moderate congestion. It needs to improve in dense city highway driving, becoming more aggressive.
 
Haven't really kept track, but if I had to guess it would probably be somewhere around 20-25. That might sound bad, but almost all of those are to correct overly conservative actions on the part of the system: tentative lane changes, slowing down to initiate passes, etc. I have only had one instance where, if I had not taken over, it would not have stayed on my route. It was a very strange circumstance where a transition lane from one highway to another is marked off with poles a long way before the actual transition. The car was caught in an outside lane and couldn't move over in time.

Interestingly, I have never had any interventions due to safety concerns. The system is pretty amazing and in the short time I have had it, I have noticed improvements to its functionality and confidence.

Dan
That's definitely our experience as well. Interventions are probably now 95%+ for overly conservative actions rather than incorrect decisions/actions.
 
It needs to improve in dense city highway driving, becoming more aggressive.

May I suggest that it actually needs to become friendlier? The secret to being admitted into a traffic queue is window down, a smile, and a grateful wave. The autonomous equivalent, I guess, is a cheery R2-D2-like melody on the horn or some kind of cute light show.
 
Haven't really kept track, but if I had to guess it would probably be somewhere around 20-25. That might sound bad, but almost all of those are to correct overly conservative actions on the part of the system: tentative lane changes, slowing down to initiate passes, etc. I have only had one instance where, if I had not taken over, it would not have stayed on my route. It was a very strange circumstance where a transition lane from one highway to another is marked off with poles a long way before the actual transition. The car was caught in an outside lane and couldn't move over in time.

Interestingly, I have never had any interventions due to safety concerns. The system is pretty amazing and in the short time I have had it, I have noticed improvements to its functionality and confidence.

Dan
My wife drives her X a lot. We bought it last July and she has 35,000 miles on it, so she has a lot of experience with AP and NoA. I just checked with her again about how it's doing. She said she recently stopped using NoA because of odd decision-making for lane changes. For instance, the car was in the left lane coming up on another car. It changed to the right lane, into a tight spot between two cars. That pushed the rear car back, and that driver then quickly changed to the left lane. The left lane then started moving quicker and the X changed back to the left lane. I asked her how often she has intervened because the car looked like it was going to take a wrong ramp or something serious. She said never. I'm thinking Mad Max is partly to blame, so I'm going to have her switch to standard.
 
Reading more about how "disengagements" are tallied. Apparently the CA definition is:

“a deactivation of the autonomous mode when a failure of the autonomous technology is detected or when the safe operation of the vehicle requires that the autonomous vehicle test driver disengage the autonomous mode and take immediate manual control of the vehicle.”

So either the vehicle disengages on its own, or the driver has to take over for safety (not convenience).

Any rough estimates as to where Tesla might be on the latest version with HW2, under this definition? 1 in 200 mi? 1 in 1,000 mi? 1 in 10,000 mi? More? I'm having a hard time estimating this specifically and separating it from the "disengagements of convenience", like taking over for a more aggressive lane change or to avoid missing a turn.
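
To make the distinction concrete, here's a toy calculation of the two ways of counting; the takeover list is purely illustrative, not real Tesla data:

```python
# Toy numbers only: separate "counted" disengagements (failure/safety, per the
# CA definition) from disengagements of convenience, then compute both rates.

takeovers = [
    {"miles_before": 40,  "reason": "convenience"},  # e.g. lane change too timid
    {"miles_before": 60,  "reason": "convenience"},  # e.g. didn't want to miss a turn
    {"miles_before": 900, "reason": "safety"},       # the only one CA-style counting includes
]

total_miles = sum(t["miles_before"] for t in takeovers)
counted = sum(1 for t in takeovers if t["reason"] in ("safety", "failure"))

print(f"All takeovers:     1 per {total_miles / len(takeovers):.0f} miles")
print(f"CA-style counting: 1 per {total_miles / counted:.0f} miles" if counted
      else "CA-style counting: none recorded")
```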
 
I'm trying to read through the Waymo disengagement reports to determine what they classify as a "disengagement" (e.g. whether only safety-critical things count, or whether the driver overriding the car for non-safety-related reasons also counts). But setting aside the CA DMV site being down (I had to use a cached version), the data is beyond useless. The vast majority of reports simply state, "Disengage for unwanted maneuver of the vehicle that was undesirable under the circumstances." The next most common are "Disengage for a perception discrepancy for which a component of the vehicle's perception system failed to detect an object correctly."

I know that, unless absolutely necessary, Waymo just lets its cars "drive poorly": they're allowed to take ages to merge, they're allowed to miss turns, etc. Drivers generally only intervene if the car is going to get in an accident, or if it's basically gotten itself into a situation that it doesn't know how to get out of.

Edit: It might be clearer if I could see a non-text-only version of the report; this text-only version makes it hard to tell whether the vehicle or the operator initiated the disengagements.
Well then, if you use Waymo's standards, I have had 0 interventions. If stupid and obnoxious are still OK, then Tesla's Autopilot is the leader by far! Seriously though, I have had no interventions whatsoever for the reasons you cite from their documentation.

Dan
 
Well then, if you use Waymo's standards, I have had 0 interventions. If stupid and obnoxious are still OK, then Tesla's Autopilot is the leader by far! Seriously though, I have had no interventions whatsoever for the reasons you cite from their documentation.

Dan

Heh, you'd have stiff competition from Waymo on the "stupid and obnoxious" front. ;) Their cars are famous for painfully slow and awkward lane changes.

Here's an investigative report on the quality of Waymo's driving (3 days, 170 miles' worth):


Edit: Just noticed this in the comments section:

"Is it now? Did they actually get binoculars to see if a human was controlling the steering wheel or not? Because 99.9999% of the time I ever saw a Waymo car (actually I think 100% of the time), it was being driven by the safety driver. It's almost like they have so little faith in their technology they almost never actually let it run."
 
Reading more about how "disengagements" are tallied. Apparently the CA definition is:

“a deactivation of the autonomous mode when a failure of the autonomous technology is detected or when the safe operation of the vehicle requires that the autonomous vehicle test driver disengage the autonomous mode and take immediate manual control of the vehicle.”

So either the vehicle disengages on its own, or the driver has to take over for safety (not convenience).

Any rough estimates as to where Tesla might be on the latest version with HW2, under this definition? 1 in 200 mi? 1 in 1,000 mi? 1 in 10,000 mi? More? I'm having a hard time estimating this specifically and separating it from the "disengagements of convenience", like taking over for a more aggressive lane change or to avoid missing a turn.
Disengagements for failure or safety reasons are negligible in our experience. None for me or my wife for months. For regular Autopilot using TACC, lane keeping, and driver-initiated lane changes, I would say ours has been well under 1 per 1,000 miles. For NoA, perhaps 1 per 1,000, probably fewer. My wife just mentioned that AP deactivated on her current drive to Milwaukee due to light rain. I need to get that spray for the front sensors. Weather remains a serious issue for AP here in WI.
 
While I loved my P90D, I think the whole conversation amongst car companies (not just Tesla) is a bunch of propeller-heads saying how incredible FSD will be and how it's "right around the corner". Has anyone sat back in their chair and thought about the practicality of FSD? You can't use it on major highways in gridlock traffic or in large cities, as it will try to maintain a good, safe distance from the car in front of it, which means other drivers and taxi cabs will jump into any opening. You will be lucky to move. Going cross-town in midtown Manhattan during daylight hours? Hahahahaha!!!!!

I think this is a poster-child case of a zoomy technical solution searching for a practical problem to solve. And that includes all the car companies in addition to Waymo, Google, and whoever else has thrown their hat in the ring. Let me be clear -- I'm not bashing Tesla -- I'm calling into question the whole topic of autonomous FSD cars from ANY company. All people want to talk about is the technology, not the fundamental value proposition behind it.

Not correct: in traffic queues, EAP keeps pretty close to the car in front.

The only issue is that it stays in the centre of the lane, and if there's a motorbike coming through you'll need to disengage and move over to the side to give space, at least on European highways.