
Waymo

00:00 Off we go once more!
00:00 Same weird dead end alley but in reverse
01:05 Very very impressive unprotected left
03:57 Nice cones demo
06:10 Teeny bit of reactive braking for truck that jumps out of the opposite lane
11:05 Protected left
13:15 Squeezing past a truck, shifting in the lane
16:25 Unprotected left
17:35 Stop sign unprotected left

01:05 is a lot like Chuck's unprotected left turn scenario. We see Waymo handle it great!
Yup, that's like Chuck's unprotected left turn.
 
Probably not, since it is outside the scope of the ODD. It requires a lot of prep work to set up and assign a new ODD.

Nope. In fact, Dolgov has repeatedly pointed out that the Waymo Driver generalizes well to new cities right from the "get-go", implying that it does not require a lot of "prep work":



So there is not a lot of prep work needed simply to drive autonomously in a new ODD. There is a lot of prep work required, but it is for launching a commercial service, not simply driving autonomously. That is an important distinction IMO. There is a big difference between simply driving autonomously and launching a viable commercial service, a distinction I would argue Cruise is just now learning the hard way.

And Dolgov showed a clip of autonomous driving on highways back in Jan 2023, so we know they've been doing highway driving for at least 8 months (just not open to the general public yet):

 

You're both largely right.

Waymo is generalized and probably 99% of everything will work fine out of the box; however, there are also going to be some very local laws or quirks that require a bit of additional time investment by staff.

At the very least, things written in state or municipal regulations need to be configured. For example: does the region allow a left on red onto a one-way street? Are U-turns allowed by default or restricted by default? What is the default speed limit where none is posted? When are school zones restricted (the date/time ranges change by region)? Which lanes are allowed to keep moving when a school bus is stopped?
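To make that concrete, here is a rough sketch (in Python) of what a per-region rule set could look like as plain configuration. Every field name and value is made up for illustration; this is not Waymo's actual format.

```
from dataclasses import dataclass, field

# Hypothetical per-region rule set; all names and values are illustrative only.
@dataclass
class RegionTrafficRules:
    left_on_red_onto_one_way: bool        # is a left on red onto a one-way street legal?
    u_turns_allowed_by_default: bool      # permitted unless signed otherwise?
    default_speed_limit_mph: int          # applies where no limit is posted
    school_zone_windows: list[str] = field(default_factory=list)  # e.g. "Mon-Fri 07:00-16:00"
    school_bus_stop_all_lanes: bool = True  # must oncoming lanes also stop for a school bus?

# Placeholder values for an imaginary region, just to show the kind of facts
# someone has to look up and fill in before the first driverless ride.
EXAMPLE_REGION = RegionTrafficRules(
    left_on_red_onto_one_way=False,
    u_turns_allowed_by_default=True,
    default_speed_limit_mph=25,
    school_zone_windows=["Mon-Fri 07:00-16:00"],
)
```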

It may not take Waymo long to introduce a new region, but it's not a zero-effort process and deploying without configuring for local laws would be wrong.

Some places will also have physical quirks that Waymo has yet to encounter. For example, an LA-specific quirk requiring additional testing is the 6-point all-stop intersection in Beverly Hills. Can the Waymo Driver correctly determine right-of-way? Probably, but it needs to be confirmed before deploying an unmanned vehicle. After that quirk is trained, they'll be able to handle any 6-point all-stop intersection in the USA, but I don't think there are any others.


It's not zero-effort for Tesla either, but ultimately it's not Tesla's fault when one of their FSD vehicles breaks the law. The person behind the wheel is supposed to be watching for and preventing that type of mistake. This difference in responsibility allows for a more aggressive roll-out.

A Toronto-specific quirk that Tesla FSD was handling illegally (before being disabled in downtown Toronto for a couple of years until the system was fixed) was related to the street-running trams. You cannot pass a stopped tram on the right while people are disembarking into the lane you're about to drive through. Today's Waymo Driver will almost certainly require additional training to handle this correctly.
 

Thanks. I think we are on the same page.

It occurred to me that the amount of prep work probably depends on what Waymo is trying to accomplish:
1) If the goal is just collecting driving data, they don't need to do any prep work. Waymo can deploy a few cars and drive around manually for a few weeks, like they did in NYC. And I believe if Waymo just wants to see how the Waymo Driver would work, they could deploy it anywhere, with no maps, with a safety driver, and see how it drives. That would not require any prep work.
2) If the goal is real testing, they would need to do a bit more prep work. They would need safety drivers and a place to store and maintain their test cars, and they would need to create HD maps of the area.
3) If the goal is validation before commercial deployment, they would need to do a lot more prep work. As you pointed out, they would need to account for local rules and quirks, work with local city officials on any special pick-up or drop-off locations, get commercial permits (like with the CPUC), do rigorous testing to validate safety, do some early-access testing with riders to get feedback, etc.
 
You're both largely right.

Waymo is generalized and probably 99% of everything will work fine out of the box; however, there are also going to be some very local laws or quirks that require a bit of additional time investment by staff.

At the very least, things written in state or municipal regulations need to be configured. For example: does the region allow a left on red onto a one-way street? Are U-turns allowed by default or restricted by default? What is the default speed limit where none is posted? When are school zones restricted (the date/time ranges change by region)? Which lanes are allowed to keep moving when a school bus is stopped?

It may not take Waymo long to introduce a new region, but it's not a zero-effort process and deploying without configuring for local laws would be wrong.

Some places will also have physical quirks that Waymo has yet to encounter. For example, an LA-specific quirk requiring additional testing is the 6-point all-stop intersection in Beverly Hills. Can the Waymo Driver correctly determine right-of-way? Probably, but it needs to be confirmed before deploying an unmanned vehicle. After that quirk is trained, they'll be able to handle any 6-point all-stop intersection in the USA, but I don't think there are any others.


It's not zero-effort for Tesla either, but ultimately it's not Tesla's fault when one of their FSD vehicles breaks the law. The person behind the wheel is supposed to be watching for and preventing that type of mistake. This difference in responsibility allows for a more aggressive roll-out.

A Toronto-specific quirk that Tesla FSD was handling illegally (before being disabled in downtown Toronto for a couple of years until the system was fixed) was related to the street-running trams. You cannot pass a stopped tram on the right while people are disembarking into the lane you're about to drive through. Today's Waymo Driver will almost certainly require additional training to handle this correctly.

None of the above should require any more _training_. It should just require rules.
Streetcars and all-way stops are common.
 
I think the confusion is that some people think Waymo is fitted to a specific city, rather than an AI trained for the general driving task. Waymo's AI is trained for general driving, kinda like how Tesla FSD Beta operates. Each city they operate in trains the system on new and varied driving tasks, and the entire fleet benefits from those new driving experiences.

This is why Waymo can scale quickly as it expands operations: the fleet moving into a new city already has the benefit of experience from all the previous cities.
 
None of the above should require any more _training_. It should just require rules.
Streetcars and all-way stops are common.

4-way and 3-way stops are common; 6-way stops are exceptionally rare. This will require a fair amount of testing to confirm the generic training applies its knowledge correctly and hasn't overfit. Creating and implementing a new test in their virtual environment takes some manual effort. They'll do quite a bit of real-world testing too.
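To give a flavor of that manual effort, here is a toy sketch (in Python) of the kind of right-of-way scenario test that might be added to a simulator. The sim harness, scenario name, and result methods are all hypothetical placeholders, not Waymo's internal tooling.

```
import itertools

# Ego plus five other vehicles arriving at a 6-way all-stop intersection.
VEHICLES = ["ego", "car_a", "car_b", "car_c", "car_d", "car_e"]

def test_six_way_all_stop_right_of_way(sim):
    """Ego should enter the intersection only after every vehicle that stopped
    before it has cleared (first to stop, first to go)."""
    for order in itertools.permutations(VEHICLES):
        scenario = sim.load("six_point_all_stop")   # hypothetical scenario id
        scenario.set_arrival_order(order)           # hypothetical setup call
        result = scenario.run()
        stopped_before_ego = set(order[: order.index("ego")])
        # Hypothetical result API: the set of vehicles that cleared before ego entered.
        assert result.cleared_before("ego") >= stopped_before_ego, (
            f"ego entered out of turn for arrival order {order}"
        )
```

Even a simple sweep like this is 720 simulated runs, and it only covers arrival order, not occlusions, pedestrians, or drivers who don't take their turn.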

San Francisco has streetcars, but how they handle stops/stations is different due to the legacy structure of its system. That difference will require new training. The external signal pattern on the vehicle indicating a stop at a station vs. a stop in traffic is also very different on a modern streetcar compared with older ones, and this will also require training.


Even trivial additions, configurations, and changes need to be detected and implemented. Quality Assurance testing prior to expanding the operations area is going to be a rate limiter on the deployment of any Level 4 system. QA cannot be skipped, as they often won't know what challenges an area presents until they experience it: the hazard of machine learning is that blind spots can be difficult to discover.

With Level 2 systems, this detailed QA is largely performed by the human driver.
 
The New York Times dispatched three reporters across the city to test Waymo’s robot taxis. I started in Alamo Square, home to the famous Painted Ladies houses. Yiwen Lu started her ride at the Marina Green, along San Francisco’s northern waterfront, and Mike Isaac started his ride near the historic Mission Dolores Basilica.

Each of us waited five to 10 minutes for a ride.

My ride was so smooth, the novelty began to wear off, turning a trip to the future into just another journey across town. The car was precise and deliberate, albeit without the flexibility or interactions you would have with a human driver. It paused for pedestrians and yielded to emergency vehicles.

Like my ride, Yiwen’s trip was downright sleepy. The car was dryly precise. It never exceeded the speed limit, used its turn signal well in advance of a lane change and yielded to pedestrians in crosswalks whom speedy drivers might disregard.

Mike’s robot taxi, however, was more aggressive. It jumped off the starting line with more acceleration than he had expected. He was mystified by the way the car zipped through several tightly packed neighborhoods before settling into the drive to the beach.

When my Waymo approached a construction project blocking the right lane, it slowed to 20 miles per hour from 30 m.p.h. and flipped on its turn signal to pull into the left lane. Moments later, the car was at a stop sign as a fire engine approached with flashing lights. The Waymo hesitated. A touch screen showed a brief explanation: “Yielding to emergency vehicle.” It waited until the fire truck passed to accelerate through the intersection.

Yiwen’s ride began with a complication: an accident, not involving the Waymo, next to a parking lot at Marina Green. Police cars were blocking part of the roadway, so the Waymo car quickly changed its route. Instead of taking the main street, the Waymo car drove onto a nearby residential street and went around the accident.

The cars were all quick to respond to pedestrians. My ride patiently waited at intersections and crosswalks as people walked their dogs, sipped coffee and rode their bikes toward Golden Gate Park.

But at the top of a hill, Mike’s car recognized a man crossing the road in a crosswalk but kept creeping forward slowly while it waited for him to get to the other side. The pedestrian gave the car — and Mike — an annoyed look.

Instead of taking the most direct route to the beach down a congested street, my Waymo crossed Golden Gate Park and drove down a less congested street, but that added a few minutes to the journey. It puttered most of the way at 29 m.p.h. — one mile per hour under the speed limit — and deferred to other drivers. At one point, it sat for a few minutes behind a car waiting to turn left rather than merging into the right lane to go around that vehicle.

My Waymo pulled into a parking lot six minutes later than it had initially predicted. It glided through the parking lot to a small, empty space where the map on the touch screen showed a circle. Once it pulled into the circle, it stopped.

Yiwen’s car was less direct. At the beginning of her journey, it told her that there would be a two-minute walk to the restaurant from her drop-off point. The car reminded her of that as it arrived and encouraged her to use the app to guide her as she walked to the Beach Chalet.

The Waymo rides were affordable, ranging from $18 to $21, about the same as an Uber.

 
Instead of taking the most direct route to the beach down a congested street, my Waymo crossed Golden Gate Park and drove down a less congested street, but that added a few minutes to the journey. It puttered most of the way at 29 m.p.h. — one mile per hour under the speed limit — and deferred to other drivers. At one point, it sat for a few minutes behind a car waiting to turn left rather than merging into the right lane to go around that vehicle.
Some Tesla members on TMC: 29MPH in a 30MPH zone? That's dangerous and idiotic. It's going to cause accidents, and should be pulled off the road immediately!! Most drivers in the area drive 40+MPH, and Waymo should too! 🤦‍♂️
 
Companies like Waymo try to avoid using hard-coded rules; they do all the autonomous driving with machine learning. ML requires training.
Traffic laws are perfect for hard coding, though. ML is imprecise and computationally inefficient. Ideally you'd have simple, drop-in traffic-law modules for each location. No right on red in Manhattan (and various other places) is literally a couple lines of code, and that code will be 100% reliable. Why maintain a separate neural network for places that don't allow right on red and spend all that time gathering thousands of "don't turn on red" scenarios to train that separate neural net? And still run the risk it might screw up?

The trick is melding the hard-coded traffic law module with generic NNs that handle the bulk of the driving task. It's no issue for perception and not a big issue for planning, but it could be tricky for prediction.
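As a sketch of that melding (in Python, with every object and field here a hypothetical placeholder rather than any real stack's API), a deterministic legality filter can simply veto whatever a learned planner proposes:

```
# Hypothetical setup: the learned planner proposes ranked candidate maneuvers,
# and a hard-coded rule module filters out anything that breaks local law.

def is_legal(maneuver, rules, world) -> bool:
    if maneuver.kind == "RIGHT_TURN" and world.signal_state == "RED":
        return rules.right_on_red_allowed       # e.g. False in Manhattan
    if maneuver.kind == "U_TURN":
        return rules.u_turns_allowed_by_default
    return True

def choose_maneuver(learned_planner, rules, world):
    # Take the highest-ranked proposal that passes the legality filter, so the
    # final decision is lawful no matter what the network preferred.
    for maneuver in learned_planner.propose(world):   # hypothetical planner API
        if is_legal(maneuver, rules, world):
            return maneuver
    return world.safe_stop_maneuver()                 # nothing legal: stop safely
```

Prediction is the harder part, as the post above notes: you would also want the models to expect that other road users follow (or routinely ignore) those same local rules.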

It's stuff like this that makes me extremely suspicious of end-to-end claims. Including Musk's.
 
From all accounts, Waymo seems to have safe, reliable, generalized autonomous driving. I especially like that the Waymo Driver seems to be getting pretty smart about handling road closures, accidents, first responders and construction zones. That's key to being able to scale, because to be truly autonomous, AVs need to be able to handle unexpected changes without relying on remote assistance all the time.

But according to the New York Times article, the fully geared-out I-Pace costs as much as $200k. It seems like it will be very hard to be profitable with robotaxis that cost that much, so I doubt that Waymo will be able to truly scale with the I-Pace. There is also the wrinkle that Jaguar has discontinued the I-Pace and the new president of Alphabet is cracking down on spending. This leads me to my question: what is Waymo's endgame?

My guess is that the I-Pace is not intended for mass scale; it is intended to be a test and "demo" vehicle. By that I mean, it lets Waymo test and develop the autonomous driving and also demo to the public what driverless ride-hailing feels like. And these ride-hailing services with the I-Pace also allow Waymo to test the business model and gain experience managing a ride-hailing service, like customer service and communicating with first responders. So there is great value in the I-Pace. But I think Waymo's plan is to develop the Waymo Driver to the point where it is generalized enough and then put it on cheaper vehicles like the Geely minivan. So I think their endgame is to eventually put the Waymo Driver on cheaper vehicles that can be profitable as robotaxis. And once the Waymo Driver is able to go on cheaper vehicles, then it can go on consumer cars too.