Welcome to Tesla Motors Club

Waymo

But Waymos have multiple LiDAR units, as well as radars and cameras, so they should never hit a stationary object. So the news is that even with all of those sensors, a Waymo still couldn't avoid hitting a power pole...
It is most likely not a hardware issue but a software bug or a missing event in the programming. It has probably been patched, and this exact situation likely won't happen again, at least not in the same manner. No matter what, RTs will make mistakes in the future. It is impossible to program for every known, every known unknown, and most especially the unknowable unknowns in the dynamic world of driving. As long as no person is injured and property damage is minimal, it is good that it happened now so Waymo can correct the code and make a repeat of this incident unlikely.

A problem is the media's clickbait fascination with every minor event that happens. In a hypothetical future where almost all cars are autonomous and accidents, deaths, and injuries are all reduced by >95%, there will of course still be some accidents. Would an autonomous car hitting a pole be news then? Of course not.
 
My guess is the Waymo was using remote assistance and there is a bug that allowed it to hit the pole.
Fleet response shouldn't be able to cause this. It can just suggest a path or, judging from their demo video, a desired waypoint to get around the issue/blockage, but the Waymo still has to decide whether it can do that and execute it. And that looked like a wide-open alley; why would it even need to ask Fleet Response for help?
 
Fleet response shouldn't be able to cause this. It can just suggest a path or, judging from their demo video, a desired waypoint to get around the issue/blockage, but the Waymo still has to decide whether it can do that and execute it. And that looked like a wide-open alley; why would it even need to ask Fleet Response for help?
Yeah. I’m just saying it’s much more likely for there to be a bug in fleet response mode than in normal autonomous mode.
Figuring out where to stop to pick people up is something Waymo and Cruise have struggled with. It wouldn’t surprise me if it’s one of the most common causes of fleet response.
 
My guess is the Waymo was using remote assistance and there is a bug that allowed it to hit the pole.
I can't imagine they disable the "avoid pole" logic when using Fleet Response. Even when FR screws up the normal accident avoidance logic remains active. When FR told the car to run a red light in SF in January the car still stopped when it detected a moped crossing its path.

They’ve driven 7 million driverless miles without any other known collisions with fixed objects, so it seems unlikely it was in its normal operating mode.
They clipped a parked car in SF and have hit a few things in parking lots, e.g. automatic gates. They've also hit road debris. Damage was always minor and I just figured these were due to geometry or timing issues. But this was a full frontal assault on a giant telephone pole! It's a colossal screwup that IMHO calls everything they do into question.
 
My guess is the Waymo was using remote assistance and there is a bug that allowed it to hit the pole. They’ve driven 7 million driverless miles without any other known collisions with fixed objects, so it seems unlikely it was in its normal operating mode.
Remote assistance can't actually remote control the car (Waymo has made that point very clear including in the latest PR). The car is still responsible for making the driving decisions and any "don't hit stationary objects" logic should still be functioning.
 
Remote assistance can't actually remote control the car (Waymo has made that point very clear, including in the latest PR). The car is still responsible for making the driving decisions and any "don't hit stationary objects" logic should still be functioning.
Obviously it should still be functioning. Lots of ways to screw up programming. Maybe the transition between autonomous and remote assistance maneuver has a bug. Maybe remote assistance can give the car authority to hit stationary objects in some cases (low tree branch? Tumbleweed? Balloon?).
I hope Waymo lets us know. Very curious.
 
Obviously it should still be functioning. Lots of ways to screw up programming. Maybe the transition between autonomous and remote assistance maneuver has a bug.

That's the thing... there is no transition according to Waymo themselves (see below), so there's no scenario where this is remote support and not the "Waymo Driver" driving into a telephone pole.

Fleet response can influence the Waymo Driver's path, whether indirectly through indicating lane closures, explicitly requesting the AV use a particular lane, or, in the most complex scenarios, explicitly proposing a path for the vehicle to consider. The Waymo Driver evaluates the input from fleet response and independently remains in control of driving. This collaboration enhances the rider experience by efficiently guiding them to their destinations.

Another news story today about Waymo as well.

1 in 4 SF school crossing guards say they've had to dodge driverless cars to avoid being hit.

 
That's the thing... there is no transition according to Waymo themselves (see below), so there's no scenario where this is remote support and not the "Waymo Driver" driving into a telephone pole.
Yeah that’s obviously not true though. For example we know that remote assistance can run red lights.
However I’m not saying they intend to have remote assistance be able to run into a telephone pole. I’m saying I could see remote assistance code being way more likely to have bugs.
 
Yeah that’s obviously not true though. For example we know that remote assistance can run red lights.
However I’m not saying they intend to have remote assistance be able to run into a telephone pole. I’m saying I could see remote assistance code being way more likely to have bugs.

*sigh*

It is 100% true, or Waymo is directly lying in their blog post. Can't have it both ways.

Ask yourself (especially in the context of everything we've seen from Waymo as of late): what's more likely, that the car itself can run a red light in an obscure situation due to a bug, or that Waymo is risking its reputation, the company, and potential jail time for its execs by lying to the public (and to investors, which could get them for wire fraud)?

EDIT: To make the point clear, any "remote assistance code" would be on a different level than the "Waymo Driver". Otherwise it would be them directly controlling the car and count as a disengagement.

Think of it like a video game. You can direct your character where to go, but it has an underlying system that (typically) won't let you run through a wall. You are suggesting they go that direction, but the underlying code has the final say and says no.
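To make the analogy concrete, here's a toy sketch of that suggest-but-veto pattern. Everything here (the grid world, the function names) is invented purely for illustration:

```python
# Toy sketch of "the player suggests, the engine has the final say."
# All names and the grid world are made up for illustration.

WALLS = {(1, 0), (1, 1), (1, 2)}  # grid cells the character can never enter

def try_move(pos, suggested):
    """The player suggests a destination; the engine vetoes illegal ones."""
    if suggested in WALLS:
        return pos          # veto: stay put; the wall check always runs
    return suggested        # suggestion accepted

# Asking to walk into a wall is refused; a legal move goes through.
print(try_move((0, 0), (1, 0)))  # -> (0, 0)
print(try_move((0, 0), (0, 1)))  # -> (0, 1)
```

The point of the pattern is that the safety check sits below the suggestion layer, so no suggestion alone should ever bypass it.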
 
*sigh*

It is 100% true, or Waymo is directly lying in their blog post. Can't have it both ways.

Ask yourself (especially in the context of everything we've seen from Waymo as of late): what's more likely, that the car itself can run a red light in an obscure situation due to a bug, or that Waymo is risking its reputation, the company, and potential jail time for its execs by lying to the public (and to investors, which could get them for wire fraud)?
They said the car ran the red light because of human error by remote assistance. Clearly remote assistance changes what the Waymo Driver is allowed to do.
 
That's the thing... there is no transition according to Waymo themselves (see below), so there's no scenario where this is remote support and not the "Waymo Driver" driving into a telephone pole.



Another news story today about Waymo as well.

1 in 4 SF school crossing guards say they've had to dodge driverless cars to avoid being hit.

Let it hit you, then retire on the lawsuit money.
 
But Waymos have multiple LiDAR units, as well as radars and cameras, so they should never hit a stationary object. So the news is that even with all of those sensors, a Waymo still couldn't avoid hitting a power pole...

Waymo has both advanced sensors and advanced software. The Waymo should not have hit that power pole; it is not normal, and the fact that it did is baffling and implies something odd happened.

I would also add that the Waymo had to move out of the lane in order to hit the pole, and it is very unlikely for a Waymo to just leave the lane for no reason. From the news clip, it sounds like the two girls were waiting for the Waymo to pick them up, so the Waymo was likely attempting to pull over for the pick-up. That would explain why it left the lane. But why it hit the pole is still baffling. It could be any number of reasons, from a perception error to a planning error to something else.

Without more info, it is impossible to say for sure, so we are left with speculation. I wish Waymo would come forward and explain the incident so that we could know why the collision happened.
 
Let it hit you, then retire on the lawsuit money.
Just back from a run this morning, and lo and behold, I was hit by a taxi (my third time being hit by a car while running; luckily all minor). He decided to turn right on red as I was crossing. I hit my "brakes" and started backpedaling, and the taxi's side-view mirror clipped my hand. Was it news? No. But had he run over and killed me, would it be news? Not really; just another fatality on the streets. However, had it been an RT/autonomous car that clipped my hand, it would have made the news if the media were told of it.

Of course, it is highly unlikely that this would happen with an RT, since it would see me.

So the crossing guards reporting that Waymos are nearly hitting them is probably hyperbole; more likely, the Waymos don't respond the same way human drivers do and seem more "unpredictable." Plus, these guards tend to be older and less tech-savvy, many likely have a natural fear bias toward high tech, and seeing a car with NO driver is sorta shocking and scary to them.
 
It is 100% true, or Waymo is directly lying in their blog post. Can't have it both ways.
Sometimes the PR department's understanding is incomplete. That's why Tesla is so awesome -- we get pure, unvarnished truth and airtight schedules straight from the horse's mouth!

Think of it like a video game. You can direct your character where to go, but it has an underlying system that (typically) won't let you run through a wall.
Great example. But some early video games had bugs which in some cases let you partly go through a wall or see through a wall or whatever. I see three possibilities here:

1. Waymo has gone NN-crazy with few or no guardrails and this was simply a NN hallucination
The least likely, IMHO. Waymo/Google are NN leaders and have long done E2E work, but they understand the limitations and have even pointed some out publicly. I don't see them adopting such a flaky approach in deployed cars.

2. Waymo asked Fleet Response "can I proceed through this object?" and FR mistakenly said yes
Also unlikely. As @Daniel in SD points out, FR may be able to tell the car to proceed over a trash bag or through a low hanging branch, but ramming a giant pole could only happen with a terrible system design AND a grossly incompetent (or malevolent) remote monitor.

3. A bug in their heuristic guardrail code caused it to ignore a clearly detected object
Such a bug could be triggered by FR, just as a video game player might trigger a "wall" bug by running parallel to the wall and turning suddenly or jumping just before hitting the wall or whatever. We'll never know unless Waymo tells us. Which they should. I don't expect driving perfection, but I do expect "don't run straight into huge poles" perfection. This is a "Day 1" issue, a Ten Commandments violation. Such a basic failure goes straight to their core and shakes my confidence in their overall system design.
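For what it's worth, possibility 3 can be sketched in a few lines. Everything here is hypothetical (the object classes, the function, the override flag); it is only meant to show how a guardrail that is sound in normal mode could be bypassed through an override path:

```python
# Hypothetical illustration of possibility 3: a guardrail that is correct in
# normal operation but has a bug reachable only via a Fleet Response override.
# Every name here is invented; this is not how Waymo's stack actually works.

SOFT_CLASSES = {"trash_bag", "tumbleweed", "balloon", "low_branch"}

def path_is_safe(obstacle_class, fr_override):
    """Return True if driving through the detected obstacle is allowed."""
    if fr_override:
        # BUG: the override was meant to apply only to SOFT_CLASSES,
        # but the class check was forgotten on this branch.
        return True
    return obstacle_class in SOFT_CLASSES  # never True for "pole"

# Normal mode correctly refuses the pole...
print(path_is_safe("pole", fr_override=False))  # -> False
# ...but an FR "proceed" suggestion silently disables the check.
print(path_is_safe("pole", fr_override=True))   # -> True
```

In this toy version the guardrail is perfectly fine in autonomous mode, which is consistent with 7 million clean miles, yet one FR interaction is enough to trigger it.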
 
*sigh*

It is 100% true, or Waymo is directly lying in their blog post. Can't have it both ways.

Ask yourself (especially in the context of everything we've seen from Waymo as of late): what's more likely, that the car itself can run a red light in an obscure situation due to a bug, or that Waymo is risking its reputation, the company, and potential jail time for its execs by lying to the public (and to investors, which could get them for wire fraud)?

It's not as binary as you want to make it.

Waymos will not (shouldn't) run a red light, but one will if that is the "safest" action needed to get out of a situation. We know remote assistance can "override" certain actions a Waymo wouldn't traditionally take. That's not to say they don't make mistakes on their own.

The readiest example I can think of is driving on private driveways, where months ago you were adamant that Waymo does not know driveways exist because of its "predefined" drivable space.

Traditionally, the Waymo Driver doesn’t traverse private driveways. Fleet response provides the Waymo Driver guidance to make even more room to efficiently clear the street and make way for the truck.

Remote assist can override that behavior and propose a path for the car to take to create space for oncoming traffic by driving onto a private driveway.



EDIT: To make the point clear, any "remote assistance code" would be on a different level than the "Waymo Driver". Otherwise it would be them directly controlling the car and count as a disengagement.

Think of it like a video game. You can direct your character where to go, but it has an underlying system that (typically) won't let you run through a wall. You are suggesting they go that direction, but the underlying code has the final say and says no.

Remote assist makes suggestions. In the example above, you can see that remote assist just drops a pin that tells the vehicle to go to a location; the car takes that suggestion and creates a trajectory using what it sees in the real world and what the planner thinks is the best way to achieve the desired state. The car is always in control.
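A rough sketch of that collaboration, using the three guidance types Waymo's blog post describes (lane closures, lane requests, proposed paths). The structure and names are my assumptions for illustration, not Waymo's actual API:

```python
# Sketch of the collaboration the blog post describes: Fleet Response sends
# guidance, but the vehicle evaluates each input itself and stays in control.
# Names and structure are assumptions for illustration, not Waymo's real API.

from dataclasses import dataclass

@dataclass
class Guidance:
    kind: str       # "lane_closure" | "use_lane" | "proposed_path"
    payload: object

def evaluate(guidance, drivable, blocked):
    """The vehicle, not the remote operator, decides whether to accept."""
    if guidance.kind == "proposed_path":
        path = guidance.payload
        # Accept only if every waypoint is drivable and obstacle-free.
        return all(p in drivable and p not in blocked for p in path)
    # Lane hints just bias the planner; nothing to reject outright here.
    return True

drivable = {(0, 0), (0, 1), (1, 1)}
blocked = {(1, 1)}  # say a pole occupies this cell

print(evaluate(Guidance("proposed_path", [(0, 0), (0, 1)]), drivable, blocked))  # -> True
print(evaluate(Guidance("proposed_path", [(0, 0), (1, 1)]), drivable, blocked))  # -> False
```

Under this design, a pin dropped on top of a pole should simply be rejected, which is why a collision like this one points to a bug in the evaluation layer rather than to remote control.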