
The Verge rides in a Waymo robotaxi with no safety driver

Both of those articles are nearly a year old. One would hope Waymo has improved since then.
  1. Was the disengagement serious? Was the safety driver just being paranoid?
  2. There is a lot of ebb and flow in driving. We could call something dangerous, but it would not necessarily cause an accident.
I'm confident Waymo's disengagement rate is much higher than those reports suggest. The biggest problem for Waymo that I've heard is that the software was / is too tentative: difficulty with congested unprotected left-hand turns and with merging. If there was a disengagement because the car wasn't doing anything, then that validates what I've heard. From reading reports, the navigation software still avoids left-hand turns and will take a circuitous route when there is a passenger. I wouldn't be surprised if it still does that when they launch a public service.

I think it's a lot clearer to people that their safety disengagement rate in Phoenix is in the hundreds of thousands of miles.

For example, we know from Lex Fridman's study (and Lex is a huge fan of Tesla and Elon) that Tesla AP has one tricky disengagement every 9.2 miles. Here's what Lex called "tricky":

"We focus on “tricky situations”, a term that refers to scenarios that, if not attended to, may lead to property damage, injury, or death. We use this measure to evaluate 8,682 Autopilot disengagements in response to tricky situations"

Clearly Lex defines tricky situations as safety-related disengagements that, if a human doesn't take over, could lead to an accident. This is the same thing Waymo uses for its safety disengagements.
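
To make the arithmetic concrete, here is a minimal sketch of how a miles-per-disengagement rate like that falls out of the raw counts. The 8,682 figure is from the quoted study; the total mileage is an assumption picked only so the numbers reproduce the claimed ~9.2 miles, since the study's total isn't quoted in this thread:

```python
# Minimal sketch: deriving a miles-per-disengagement rate from raw counts.
# 8,682 tricky disengagements is from the quoted study; total_autopilot_miles
# is an ASSUMPTION chosen only to be consistent with the claimed ~9.2 miles.
tricky_disengagements = 8_682
total_autopilot_miles = 80_000  # assumed, not stated in this thread

miles_per_tricky = total_autopilot_miles / tricky_disengagements
print(f"one tricky disengagement every {miles_per_tricky:.1f} miles")  # ~9.2
```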

Now a Tesla fan would like to conflate the two and say that Waymo's disengagement rate is the same as Tesla's, in an attempt to minimize what Waymo is doing and to prop up Tesla. But whatever number they use, "50" or whatever, that number, as you know, does NOT represent disengagements made to AVOID an accident or an unsafe action or behavior. They were not safety-related disengagements. Meanwhile, a majority of Autopilot disengagements are safety disengagements (though some are also not related to safety), and of course we have a study showing that there is a safety-related disengagement every 9.2 miles.

How Waymo defines a safety-related disengagement

To help evaluate the significance of driver disengagements, we employ a powerful simulator program -- developed in-house by our engineers -- that allows the team to “replay” each incident and predict the behavior of the self-driving car (had the driver not taken control of it) as well as the behavior and positions of other road users in the vicinity (such as pedestrians, cyclists, and other vehicles). The simulator can also create thousands of variations on that core event so we can evaluate what would have happened under slightly different circumstances, such as our vehicle and other road users moving at different times, speeds, and angles. Through this process we can determine the events that have safety significance and should receive prompt and thorough attention from our engineers in resolving them. In the reporting period, there were 69 events across our fleet in which safe operation of the vehicle required disengagement by the driver.

Of the 69 reportable safe operation events, 13 were “simulated contacts” -- events in which, upon replaying the event in our simulator, we determined that the test driver prevented our vehicle from making contact with another object. The remaining 56 of the 69 events were safety-significant because, under simulation, we identified some aspect of the SDC’s behavior that could be a potential cause of contacts in other environments or situations if not addressed. This includes proper perception of traffic lights, yielding properly to pedestrians and cyclists, and violations of traffic laws.

To be clear, however, these 56 events during the reporting period would very likely not have resulted in a real-world contact if the test driver had not taken over. In 10 of the 13 simulated contact events, the SDC’s predicted behavior would have, in simulation, caused contact (though 2 of these involved simulated contact with traffic cones). In 3 of the 13 occasions, a driver in another vehicle made a move that would have, in simulation, caused a contact with our car (e.g., in one case the other vehicle was driving the wrong way down the road in the SDC’s path); in these cases, we believe a human driver could have taken a reasonable action to avoid the contact but the simulation indicated the SDC would not have taken that action.

Waymo has already detailed their safety disengagements and how they run every single disengagement through a simulator to see what would have happened.

For Waymo, the criteria are actually more STRICT, because they not only replay the events to see if the Waymo would have done anything unsafe, they also randomize with "different circumstances, such as our vehicle and other road users moving at different times, speeds, and angles" to check that no similar scenario under different circumstances would lead to an unsafe action. If one did, they would count it as a safety disengagement.
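
To illustrate the replay-and-perturb process described above, here is a toy sketch. Every name in it (Event, would_contact, the perturbation ranges) is hypothetical; Waymo's actual simulator is proprietary. The sketch only shows the classification logic: an event counts as safety-significant if the replayed car, without the takeover, makes contact either as logged or under any of thousands of randomized variations:

```python
import random
from dataclasses import dataclass

@dataclass
class Event:
    """One logged disengagement. Fields are hypothetical stand-ins."""
    event_id: int
    min_predicted_gap_m: float  # smallest predicted gap to another object

def would_contact(event: Event, time_offset_s: float = 0.0,
                  speed_scale: float = 1.0) -> bool:
    """Toy stand-in for the replay: shrink the predicted gap when other
    road users move earlier/later or faster, and call it contact if the
    gap closes. A real simulator would re-run full vehicle behavior."""
    gap = event.min_predicted_gap_m
    gap -= abs(time_offset_s) * 0.5           # timing shifts eat margin
    gap -= max(0.0, speed_scale - 1.0) * 5.0  # faster road users eat margin
    return gap <= 0.0

def is_safety_significant(event: Event, n_variations: int = 1000) -> bool:
    # 1) Replay the core event exactly as logged, minus the takeover.
    if would_contact(event):
        return True  # a "simulated contact"
    # 2) Replay thousands of variations: other road users at different
    #    times and speeds (angles omitted in this toy version).
    for _ in range(n_variations):
        if would_contact(event,
                         time_offset_s=random.uniform(-2.0, 2.0),
                         speed_scale=random.uniform(0.8, 1.2)):
            return True  # unsafe under a nearby scenario
    return False  # not counted as a safety disengagement

events = [Event(1, min_predicted_gap_m=0.0),   # contact as logged
          Event(2, min_predicted_gap_m=0.3),   # contact under variation
          Event(3, min_predicted_gap_m=10.0)]  # comfortably safe
print([is_safety_significant(e) for e in events])  # [True, True, False]
```

In the quoted report's terms, the first branch would correspond to the 13 "simulated contacts" and the second to the 56 events that were safety-significant only under some variation of circumstances.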

For Autopilot we already know, because most of the disengagements are obvious failures when you see them. Take for example:

Do you think that Waymo and Lex would consider all the disengagements here as safety-related? Of course.

Now here's a recent disengagement from a Waymo One rider, for example.

Solely viewing it from an autonomous vehicle perspective, I still don't mind having a safety driver present. Last night, I decided to take a 1.5 mile trip to dinner with an unprotected left turn. The turn is across three lanes and a turn lane. It was rush hour with a blinking yellow. Dark, but otherwise perfect weather. We needed manual mode to complete the turn. At the end of the otherwise perfect trip, I left a failing score for the entire trip and noted the turn with the strongest negative option available. I figure that, if they are planning to start operating truly driver-less, it's required for manual mode to only ever be needed by the car for logistical outliers (like a blocked lane or a closed street).

So here you see a disengagement where they were sitting in an unprotected left-turn lane in dense rush-hour traffic. Instead of waiting for a long time, the safety driver decided to just make the turn themselves. Again, that's NOT safety-related at all.

This is similar to someone on Autopilot who is sitting behind a parked car and, since AP won't overtake, takes over to overtake themselves. That's not safety-related. Or if NoA will miss their exit and they take over to take the exit, or if they are on regular AP and decide they want to leave the highway. Again, not safety-related. These won't count in Lex's "tricky disengagements" or in Waymo's simulator.

So then why compare Waymo's non-safety disengagements with Tesla's obvious safety disengagements, rather than safety disengagements versus safety disengagements?

It's because Waymo's safety disengagement rate is clearly in the 100k+ miles range and Tesla's is clearly in the tens of miles.
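
Taking the post's own claimed figures at face value (they are the poster's claims, not verified data), the gap looks like this:

```python
# Hedged comparison using only the figures claimed in this thread.
waymo_miles_per_safety_diseng = 100_000  # claimed "100k+ range"
tesla_miles_per_safety_diseng = 9.2      # from the quoted Fridman study

ratio = waymo_miles_per_safety_diseng / tesla_miles_per_safety_diseng
print(f"~{ratio:,.0f}x more miles between safety disengagements")  # ~10,870x
```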
 
Here's how I see it. Tesla is more likely to make money on FSD on a per-car basis than Waymo in the near future, even if a lot of it is deferred revenue. People are willing to pay for what is basically an advanced driver-assistance feature with a questionable timeframe for Level 3+.

How much money does it take for Waymo to build these cars? And then they pretty much only work in very narrow environments (Tesla has the same issue).

Ultimately FSD is more likely to be a profit center for Tesla as a Level 2 feature set than Waymo building an Uber-like service that requires a huge investment in equipment and mapping.
 
This requires that Tesla be able to deliver on FSD. Most in the industry think it is on the verge of impossible (with any near term technology) to deliver on FSD with the sensor suite currently in Tesla cars. Some day it may become possible to do that, but there is no sign of it being imminent. If Tesla changes that, and invents the breakthrough necessary for that, they will indeed profit well. If they don't then they are a very distant finisher in the game. If somebody other than Tesla makes the breakthrough, it will revolve around whether Tesla can learn from that, or buy that. The seller will of course demand a high price.
 
Welcome to TMC, Brad! It's great to have you here. :) I enjoy reading your work in Forbes.

What do you think of Mobileye's “camera-centric approach”? I recently heard Amnon Shashua explain this idea in this video starting at 48:30 and ending at 51:30.


Personally, I think if Tesla has a “breakthrough”, it's large-scale fleet learning. I put “breakthrough” in quotes because this isn't a new technological invention or a scientific discovery; it's just a different approach enabled by scale (and the ability and willingness to leverage scale for deep learning).

I think Tesla can always combine large-scale fleet learning and lidar if it finds itself lagging competitors in launching profitable, scalable robotaxi services and if it's not making progress fast enough to catch up. This would involve launching test vehicles at a small scale, just as Waymo, Cruise, Zoox, and others have done.

The test vehicles would benefit from all the fruits of Tesla's large-scale fleet learning. Then Tesla could add expensive, high-grade lidar and work on lidar perception and sensor fusion. Maybe also lidar HD maps. This would take time, but combining the large-scale and small-scale approaches together could be powerful and allow Tesla to move fast enough to catch up.

If/when robotaxi-level autonomy is solved using this combined approach, Tesla can retrofit Hardware 3 cars (i.e. the kind of car you get if you buy a new Tesla today) with lidar. Even if lidar is still super expensive at that point, robotaxi revenue can offset that (otherwise Waymo wouldn't have a business model in this scenario). To all the customers who own Hardware 3 cars, Tesla could offer a choice: a) Tesla buys back the car and deploys it as a robotaxi or b) Tesla retrofits the car with lidar and recoups the cost by taking a percentage of any robotaxi revenue that vehicle generates.

The good news for Tesla is that, in this scenario, it would be able to grow its robotaxi fleet quite quickly by both retrofitting existing cars with lidar and producing new ones with lidar using its existing manufacturing capacity.
 
... Most in the industry think it is on the verge of impossible (with any near term technology) to deliver on FSD with the sensor suite currently in Tesla cars.
There are a wide variety of opinions on this. There are plenty of people who say it can't be done anytime soon, period; for example, Lex Fridman. On the other side of the coin there are experts who say Tesla is going to win, like Comma.ai.
Some day it may become possible to do that, but there is no sign of it being imminent.
One can claim the same B.S. about Waymo. They've been saying for the past several years that it is coming next year.
If Tesla changes that, and invents the breakthrough necessary for that, they will indeed profit well. If they don't then they are a very distant finisher in the game. If somebody other than Tesla makes the breakthrough, it will revolve around whether Tesla can learn from that, or buy that. The seller will of course demand a high price.
Change every word from Tesla to Waymo or any other driverless-car company and you'd have it right as well.
 
How long do you think it would take Elon to catch up and overtake those other lidar companies?

Dunno. Maybe Tesla could do an acquisition (like with DeepScale) to speed things up. There are lots of small startups that have been working on autonomous vehicles with lidar for years. What Tesla would be buying is the lidar perception and/or sensor fusion software, rather than taking the time to develop that software itself.

If Waymo plainly and definitively cracks the robotaxi nut, it will have to figure out how to scale manufacturing of its (rumoured to be) $400,000 minivans. So, maybe this would give Tesla time to do an acquisition or develop its own lidar software.