Both of those articles are nearly a year old. One would hope Waymo has improved since then.
I'm confident Waymo's miles per disengagement are much higher than those reports suggest. The biggest problem I've heard about for Waymo is that the software was (and may still be) too tentative: difficulty with congested unprotected left turns and with merging. If there was a disengagement because the car simply wasn't doing anything, that validates what I've heard. From reading reports, the navigation software still avoids unprotected left turns and will take a circuitous route when there is a passenger. I wouldn't be surprised if it still does that when they launch a public service.
- Was the disengagement serious? Was the safety driver just being paranoid?
- There is a lot of ebb and flow in driving. We might call a behavior dangerous even though it would not necessarily cause an accident.
I think it's a lot clearer to people now that Waymo's miles per safety disengagement in Phoenix is in the hundreds of thousands.
For example, we know from a study by Lex Fridman (a huge fan of Tesla and Elon) that Tesla AP has one tricky disengagement every 9.2 miles. Here's how Lex defined "tricky":
"We focus on “tricky situations”, a term that refers to scenarios that, if not attended to, may lead to property damage, injury, or death. We use this measure to evaluate 8,682 Autopilot disengagements in response to tricky situations"
Clearly Lex defines tricky situations as safety-related disengagements: ones where, if a human doesn't take over, the result could be an accident. That is the same criterion Waymo uses for its safety disengagements.
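To put that rate in concrete terms, here's the implied mileage behind those figures. This is a back-of-the-envelope calculation using only the two numbers from the study quoted above, assuming they come from the same dataset:

```python
# Back-of-the-envelope math using only the two figures quoted above.
# Assumption: the 8,682 tricky disengagements and the one-per-9.2-mile
# rate describe the same dataset, so total miles = count * rate.
tricky_disengagements = 8_682
miles_per_tricky_disengagement = 9.2

total_autopilot_miles = tricky_disengagements * miles_per_tricky_disengagement
print(f"Implied Autopilot miles studied: {total_autopilot_miles:,.0f}")
# -> Implied Autopilot miles studied: 79,874
```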
Now a Tesla fan would like to conflate the two and say that Waymo's disengagement rate is the same as Tesla's, in an attempt to minimize what Waymo is doing and to prop up Tesla. But whatever number they use, "50" or whatever, those were, as you know, NOT disengagements to AVOID an accident or to avoid an unsafe action or behavior. They were not safety-related disengagements. By contrast, a majority of Autopilot disengagements are safety disengagements (some, of course, are not safety-related either), and we have a study showing a safety-related disengagement every 9.2 miles.
How Waymo defines a safety-related disengagement:
To help evaluate the significance of driver disengagements, we employ a powerful simulator program -- developed in-house by our engineers -- that allows the team to “replay” each incident and predict the behavior of the self-driving car (had the driver not taken control of it) as well as the behavior and positions of other road users in the vicinity (such as pedestrians, cyclists, and other vehicles). The simulator can also create thousands of variations on that core event so we can evaluate what would have happened under slightly different circumstances, such as our vehicle and other road users moving at different times, speeds, and angles. Through this process we can determine the events that have safety significance and should receive prompt and thorough attention from our engineers in resolving them. In the reporting period, there were 69 events across our fleet in which safe operation of the vehicle required disengagement by the driver.
Of the 69 reportable safe operation events, 13 were “simulated contacts” -- events in which, upon replaying the event in our simulator, we determined that the test driver prevented our vehicle from making contact with another object. The remaining 56 of the 69 events were safety-significant because, under simulation, we identified some aspect of the SDC’s behavior that could be a potential cause of contacts in other environments or situations if not addressed. This includes proper perception of traffic lights, yielding properly to pedestrians and cyclists, and violations of traffic laws.
To be clear, however, these 56 events during the reporting period would very likely not have resulted in a real-world contact if the test driver had not taken over. In 10 of the 13 simulated contact events, the SDC’s predicted behavior would have, in simulation, caused contact (though 2 of these involved simulated contact with traffic cones). In 3 of the 13 occasions, a driver in another vehicle made a move that would have, in simulation, caused a contact with our car (e.g., in one case the other vehicle was driving the wrong way down the road in the SDC’s path); in these cases, we believe a human driver could have taken a reasonable action to avoid the contact but the simulation indicated the SDC would not have taken that action.
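Laying out the breakdown from that passage (no new numbers, just the ones Waymo reports):

```python
# Breakdown of Waymo's 69 reportable safe-operation events, exactly as
# described in the passage above (no numbers added).
events = {
    "simulated_contacts": 13,   # simulator shows the driver prevented contact
    "behavior_flagged": 56,     # behavior could cause contact in other situations
}
simulated_contact_detail = {
    "sdc_would_have_caused_contact": 10,  # 2 of these were with traffic cones
    "other_driver_caused": 3,             # e.g., a wrong-way driver
}

assert sum(events.values()) == 69
assert sum(simulated_contact_detail.values()) == events["simulated_contacts"]
```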
Waymo has already detailed their safety disengagements and how they run every single disengagement through a simulator to see what would have happened.
For Waymo, the criteria are actually more STRICT, because they not only replay each event to see whether the Waymo would have done anything unsafe, they also randomize the scenario with "different circumstances, such as our vehicle and other road users moving at different times, speeds, and angles" to check that no similar scenario would lead to an unsafe action. If any variation did, they would count it as a safety disengagement.
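Waymo hasn't published code for this, but conceptually the replay-and-vary process works something like the sketch below. Everything here is hypothetical (the function names, the variation grids, the stub simulator); it just illustrates the logic of counting a disengagement as safety-related if any nearby variation of the scenario ends in an unsafe outcome:

```python
import itertools
from dataclasses import dataclass

# Hypothetical sketch of the replay-and-vary process described above. None
# of this is real Waymo code; the simulator is stubbed out, and the offsets
# just mirror the "different times, speeds, and angles" language.

@dataclass
class Outcome:
    contact: bool
    unsafe_behavior: bool

TIME_OFFSETS_S = [-1.0, -0.5, 0.0, 0.5, 1.0]   # shift other actors in time
SPEED_FACTORS = [0.8, 0.9, 1.0, 1.1, 1.2]      # scale other actors' speeds
ANGLE_OFFSETS_DEG = [-10, -5, 0, 5, 10]        # perturb approach angles

def is_safety_disengagement(event, replay_without_driver) -> bool:
    """Count the event as safety-related if ANY nearby variation of the
    scenario ends in contact or unsafe behavior when the simulated car
    continues without the safety driver's takeover."""
    for dt, sf, da in itertools.product(
            TIME_OFFSETS_S, SPEED_FACTORS, ANGLE_OFFSETS_DEG):
        outcome = replay_without_driver(
            event, time_offset=dt, speed_factor=sf, angle_offset=da)
        if outcome.contact or outcome.unsafe_behavior:
            return True
    return False

# Stub simulator so the sketch runs: pretend only the unshifted replay of
# this particular event would have led to contact.
def fake_replay(event, time_offset, speed_factor, angle_offset):
    return Outcome(contact=(time_offset == 0.0 and event == "left_turn_42"),
                   unsafe_behavior=False)

print(is_safety_disengagement("left_turn_42", fake_replay))  # True
```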
For Autopilot we already know, because most of the disengagements are obvious failures when you see them. Take for example:
Do you think that Waymo and Lex would consider all the disengagements here as safety-related? Of course.
Now, for example, here's a recent disengagement reported by a Waymo One rider:
Solely viewing it from an autonomous vehicle perspective, I still don't mind having a safety driver present. Last night, I decided to take a 1.5 mile trip to dinner with an unprotected left turn. The turn is across three lanes and a turn lane. It was rush hour with a blinking yellow. Dark, but otherwise perfect weather. We needed manual mode to complete the turn. At the end of the otherwise perfect trip, I left a failing score for the entire trip and noted the turn with the strongest negative option available. I figure that, if they are planning to start operating truly driver-less, it's required for manual mode to only ever be needed by the car for logistical outliers (like a blocked lane or a closed street).
So here you see a disengagement where they were sitting in an unprotected left-turn lane in dense rush-hour traffic. Instead of waiting for a long time, the safety driver decided to just make the turn themselves. Again, that's NOT safety-related at all.
This is similar to someone on Autopilot who is sitting behind a parked car and, since AP won't overtake, takes over to overtake themselves. That's not safety-related. Or if NOA is about to miss their exit and they take over to take the exit, or if they are on regular AP and decide they want to leave the highway. Again, not safety-related. None of this would count among Lex's "tricky" disengagements or in Waymo's simulator.
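The split matters in rate terms, because only the safety-related subset should go into the miles-per-disengagement figure. A toy illustration (all categories and numbers invented):

```python
# Toy illustration (all data invented) of why the safety/non-safety split
# changes the headline number. Only safety-related takeovers should count
# when computing miles per safety disengagement.
disengagements = [
    {"cause": "would_run_red_light", "safety": True},
    {"cause": "driver_impatient_at_left_turn", "safety": False},
    {"cause": "driver_wanted_different_exit", "safety": False},
    {"cause": "failed_to_yield_to_cyclist", "safety": True},
]
total_miles = 1_000  # invented

safety_events = [d for d in disengagements if d["safety"]]
print(f"Miles per disengagement (all causes): {total_miles / len(disengagements):.0f}")
print(f"Miles per safety disengagement:       {total_miles / len(safety_events):.0f}")
# All causes: 250 miles. Safety only: 500 miles. The safety-only figure
# is always at least as high, and usually much higher.
```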
So then why compare Waymo's non-safety disengagements with Tesla's obvious safety disengagements, rather than comparing safety disengagements to safety disengagements?
It's because Waymo's miles per safety disengagement is clearly in the 100k+ range while Tesla's is clearly in the tens.
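Taking this post's own numbers at face value (Lex's 9.2 miles per tricky disengagement for Autopilot, and my 100k+ estimate for Waymo, which is my estimate and not an official figure), the gap is roughly four orders of magnitude:

```python
# Comparing the two rates used in this post. The Tesla figure is from the
# Fridman study quoted above; the Waymo figure is my estimate (a lower
# bound), not an official number.
tesla_miles_per_safety_disengagement = 9.2
waymo_miles_per_safety_disengagement = 100_000

ratio = waymo_miles_per_safety_disengagement / tesla_miles_per_safety_disengagement
print(f"Waymo goes roughly {ratio:,.0f}x farther between safety disengagements")
# -> Waymo goes roughly 10,870x farther between safety disengagements
```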