> But Waymos have multiple LiDAR units, as well as radars and cameras, so that they should never hit a stationary object. So, the news is that even with all of those sensors Waymo still can't avoid hitting a power pole...

It is obviously not a hardware issue but a software bug or missing event programming in the code. It has likely been patched, and this exact situation probably won't happen again, at least not in the same manner. No matter what, RTs will make mistakes in the future. It is impossible to program and execute for every known, every known unknown, and most especially the unknowable unknowns in the dynamic world of driving. As long as no person is injured and property damage is minimal, it is good that it happened now, so Waymo can correct the code and make a repeat of this incident unlikely.
> My guess is the Waymo was using remote assistance and there is a bug that allowed it to hit the pole.

Fleet response shouldn't be able to cause this. It can only suggest a path or, judging from their demo video, a desired waypoint to get around the issue/blockage, but the Waymo still has to decide whether it can do that and execute it. And that looked like a wide-open alley; why would it even need to ask Fleet Response for help?
> Fleet response shouldn't be able to cause this. It can only suggest a path or, judging from their demo video, a desired waypoint to get around the issue/blockage, but the Waymo still has to decide whether it can do that and execute it. And that looked like a wide-open alley; why would it even need to ask Fleet Response for help?

Yeah. I’m just saying it’s much more likely for there to be a bug in fleet response mode than in normal autonomous mode.
> My guess is the Waymo was using remote assistance and there is a bug that allowed it to hit the pole.

I can't imagine they disable the "avoid pole" logic when using Fleet Response. Even when FR screws up, the normal accident-avoidance logic remains active. When FR told the car to run a red light in SF in January, the car still stopped when it detected a moped crossing its path.
> They’ve driven 7 million driverless miles without any other known collision with a fixed object, so it seems unlikely it was in its normal operating mode.

They clipped a parked car in SF and have hit a few things in parking lots, e.g. automatic gates. They've also hit road debris. Damage was always minor, and I just figured these were due to geometry or timing issues. But this was a full frontal assault on a giant telephone pole! It's a colossal screwup that IMHO calls everything they do into question.
> My guess is the Waymo was using remote assistance and there is a bug that allowed it to hit the pole. They’ve driven 7 million driverless miles without any other known collision with a fixed object, so it seems unlikely it was in its normal operating mode.

Remote assistance can't actually remote-control the car (Waymo has made that point very clear, including in the latest PR). The car is still responsible for making the driving decisions, and any "don't hit stationary objects" logic should still be functioning.
> Remote assistance can't actually remote-control the car (Waymo has made that point very clear, including in the latest PR). The car is still responsible for making the driving decisions, and any "don't hit stationary objects" logic should still be functioning.

Obviously it should still be functioning. There are lots of ways to screw up programming. Maybe the transition between the autonomous and remote-assistance maneuvers has a bug. Maybe remote assistance can give the car authority to hit stationary objects in some cases (low tree branch? Tumbleweed? Balloon?).
Fleet response can influence the Waymo Driver's path, whether indirectly through indicating lane closures, explicitly requesting the AV use a particular lane, or, in the most complex scenarios, explicitly proposing a path for the vehicle to consider. The Waymo Driver evaluates the input from fleet response and independently remains in control of driving. This collaboration enhances the rider experience by efficiently guiding them to their destinations.
> That's the thing... there is no transition according to Waymo themselves (see below), so there's no scenario where this is remote support and not the "Waymo Driver" driving into a telephone pole.

Yeah, that’s obviously not true though. For example, we know that remote assistance can run red lights.

However, I’m not saying they intend to have remote assistance be able to run into a telephone pole. I’m saying I could see the remote assistance code being way more likely to have bugs.
> *sigh*
>
> It is 100% true, or Waymo is directly lying in their blog post. Can't have it both ways.
>
> Ask yourself (especially with the context of everything we've seen from Waymo as of late) what's more likely: that the car itself can run a red light in an obscure situation because of a bug, or that Waymo is risking their reputation, the company, and, for the execs, potential jail time by lying to the public (and to investors, which could get them for wire fraud)?

They said the car ran the red light because of human error by remote assistance. Clearly remote assistance changes what the Waymo Driver is allowed to do.
> That's the thing... there is no transition according to Waymo themselves (see below), so there's no scenario where this is remote support and not the "Waymo Driver" driving into a telephone pole.

Let it hit you, then retire on the lawsuit money.
Another news story about Waymo today:

1 in 4 SF school crossing guards say they've had to dodge driverless cars to avoid being hit.

The NBC Bay Area Investigative Unit spoke to 30 school crossing guards stationed at more than 20 schools across San Francisco and found nearly one in four said they had experienced a “close call” in the crosswalk with an autonomous vehicle. (www.nbcbayarea.com)
> Let it hit you, then retire on the lawsuit money.

Many crossing guards in the Bay Area are volunteers and may be at retirement age already anyway, so they aren't really in it for the money. Plus, there is no guarantee that if you get hit you will survive or remain able-bodied.
But Waymos have multiple LiDAR units, as well as radars and cameras, so that they should never hit a stationary object. So, the news is that even with all of those sensors Waymo still can't avoid hitting a power pole...
> Let it hit you, then retire on the lawsuit money.

Just back from a run this morning, and lo and behold, I was hit by a taxi (my 3rd time being hit by a car while running; luckily all minor). Of course it was an HT and not an RT. He decided to turn right on red as I was crossing. I hit my "brakes" and started backpedaling, and the taxi's side-view mirror clipped my hand. Was it news? No. But had he run over and killed me, would it be news? Not really; just another fatality on the streets. However, had it been an RT/autonomous car that clipped my hand, it would have made the news if the media were told of it.
> It is 100% true, or Waymo is directly lying in their blog post. Can't have it both ways.

Sometimes the PR department's understanding is incomplete. That's why Tesla is so awesome -- we get pure, unvarnished truth and airtight schedules straight from the horse's mouth!
> Think of it like a video game. You can direct your character where to go, but it has an underlying system that (typically) won't let you run through a wall.

Great example. But some early video games had bugs that in some cases let you partly go through a wall, or see through a wall, or whatever. I see three possibilities here:
*sigh*
It is 100% true, or Waymo is directly lying in their blog post. Can't have it both ways.
Ask yourself (especially with the context of everything we've seen from Waymo as of late) what's more likely: that the car itself can run a red light in an obscure situation because of a bug, or that Waymo is risking their reputation, the company, and, for the execs, potential jail time by lying to the public (and to investors, which could get them for wire fraud)?
EDIT: To make the point clear, any "remote assistance code" would be on a different level than the "Waymo Driver". Otherwise it would be them directly controlling the car and count as a disengagement.
Think of it like a video game. You can direct your character where to go, but it has an underlying system that (typically) won't let you run through a wall. You are suggesting they go that direction, but the underlying code has the final say and says no.
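The video-game analogy can be sketched in code. This is a toy Python sketch of the "suggest, don't command" pattern, not Waymo's actual software: the grid world, obstacle set, and function names are all invented for illustration. The point is only that a remotely suggested waypoint is validated against the vehicle's own obstacle map before it is accepted, so the onboard driver keeps the final say.

```python
# Toy sketch of the pattern described above: remote assistance proposes
# a waypoint, but the onboard driver checks the resulting path against
# its own obstacle map and retains veto power. Purely illustrative.

OBSTACLES = {(3, 4), (5, 5)}  # hypothetical occupied grid cells (e.g., a pole)

def straight_line_cells(start, goal):
    """Cells visited stepping one axis at a time toward goal (toy path model)."""
    x, y = start
    cells = []
    while (x, y) != goal:
        if x != goal[0]:
            x += 1 if goal[0] > x else -1
        elif y != goal[1]:
            y += 1 if goal[1] > y else -1
        cells.append((x, y))
    return cells

def accept_suggestion(current, suggested_waypoint, obstacles=OBSTACLES):
    """The onboard driver, not the remote operator, makes the call:
    reject any suggested path that would pass through a known obstacle."""
    path = straight_line_cells(current, suggested_waypoint)
    return all(cell not in obstacles for cell in path)

# Remote assistance suggests a detour; the driver validates it first.
print(accept_suggestion((0, 4), (6, 4)))  # path crosses the pole at (3, 4) -> False
print(accept_suggestion((0, 0), (6, 0)))  # clear path -> True
```

In this sketch the "bug that allowed it to hit the pole" would correspond to a missing or wrong entry in the obstacle map, or to validation being skipped on one code path, not to the operator steering the car directly.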
> An example of the Waymo Driver taking evasive action to avoid a collision:

It even appears it was courteous enough to turn on its right turn signal.