
Waymo

Some smart driving and some dumb rerouting and drop off:


Timestamps:
00:00 Car pulling up
00:47 Ride start
03:47 Light rail vis
08:27 Trailer blocking lane
09:05 Goes around trailer (smart)
15:10 Rerouting detour
15:40 Unprotected right with the sun blocking view (smart)
16:05 Construction zone unexpected detour (stupid)
16:35 Five point turn
17:50 Straight crossing four lanes of traffic (smart)
18:35 Odd... routing and dropoff (stupid)

We see the Waymo stop and wait a bit before going around the trailer. It seems Waymo is trained for more cautious behavior. I wonder if that extra caution is a response to the towed-truck accident.
 
  • Informative
Reactions: Doggydogworld
WSJ has an article about Waymo testing on highways. The article is behind a paywall but here is one relevant bit:

The rollout on freeways in Phoenix has largely been without trouble so far, Waymo says. The cars have been able to successfully navigate on and off ramps and haven’t come to a halt in the middle of the freeway—another issue that occurred on city streets. When the cars got confused or encountered a problem, they would sometimes stop where they were on the road, backing up traffic and frustrating other drivers who couldn’t communicate with the car. Sugandha Sangal, a product manager at Waymo, says they have built-in redundancies to prevent that from happening on fast-moving freeways. If one sensor system fails, there is another in place to help.
Over the past year, Waymo vehicles in testing have been involved in a handful of incidents on freeways. Last March, one of the cars in San Francisco hit some tire scraps while transitioning from Interstate 280 to Interstate 380, causing damage to the car, according to state records. In another incident the same day, a Waymo was driving itself on Bayshore Boulevard in San Francisco when a safety driver—someone who sits in the driver’s seat of the car and can take over in the case of an emergency—crashed into another car that was stopped in the rightmost lane. Both vehicles sustained damage.


Good to see that Waymo is not stopping in the middle of a freeway when there is an issue.
 
We see the Waymo stop and wait a bit before going around the trailer. It seems Waymo is trained for more cautious behavior. I wonder if that extra caution is a response to the towed-truck accident.
I suspect it asked a remote monitor if it was OK to go around. My theory is they only put a message on the screen when the car is truly stuck, not when simply asking for help choosing between two valid alternatives.

WSJ has an article about Waymo testing on highways. The article is behind a paywall but here is one relevant bit:

"Waymo was driving itself on Bayshore Boulevard in San Francisco when a safety driver—....—crashed into another car that was stopped in the rightmost lane."
Terrible wording. If the car was driving itself then it wasn't the safety driver who crashed. Even if the safety driver failed to prevent the crash, it was still the car that crashed. No Tesla-style blame shifting allowed!!! On the other hand, if the car was in manual mode then the human truly was at fault. But in that case he was just a driver, not a "safety driver" and the car was not "driving itself".
 
  • Like
Reactions: flutas
Terrible wording. If the car was driving itself then it wasn't the safety driver who crashed. Even if the safety driver failed to prevent the crash, it was still the car that crashed. No Tesla-style blame shifting allowed!!! On the other hand, if the car was in manual mode then the human truly was at fault. But in that case he was just a driver, not a "safety driver" and the car was not "driving itself".

I bet the test-phase 'self-driving' software is different from the normal self-driving software: during testing, a human needs to be able to take over quickly, whereas with the normal software the concern is a customer grabbing the controls for fun or to cause damage.

Safety drivers are on the hook, though. Recall the Uber safety driver who pleaded guilty after the car killed a pedestrian in the Phoenix area.
 
I suspect it asked a remote monitor if it was OK to go around. My theory is they only put a message on the screen when the car is truly stuck, not when simply asking for help choosing between two valid alternatives.

It is also possible that the Waymo Driver stopped out of caution and figured it out on its own. Waymo has said that in most cases where the Waymo Driver calls remote assistance, it figures it out on its own.

Terrible wording. If the car was driving itself then it wasn't the safety driver who crashed. Even if the safety driver failed to prevent the crash, it was still the car that crashed. No Tesla-style blame shifting allowed!!! On the other hand, if the car was in manual mode then the human truly was at fault. But in that case he was just a driver, not a "safety driver" and the car was not "driving itself".

Agreed on the bad wording. It could be that the Waymo was in autonomous mode before the crash. The safety driver took over but failed to prevent the crash.
 
Some interesting scenarios. At the 11:06 mark, we see a pedestrian trigger the "disturbance detected" alert.


0:00 Continued from previous episode…
1:58 Partially entering turn lane
5:48 Stuck in crosswalk due to blocked intersection
9:23 Nudging into adjacent lane for nearby pedestrian
10:31 Tourists admiring Waymo
11:06 Disturbance detected outside message
11:34 Nudging into adjacent lane for nearby pedestrians
11:49 Rider support calls me
12:05 Tight squeeze to enter turn lane
13:35 Unprotected left turn with traffic
14:20 Yielding to cut-in vehicle
14:31 HD map curb drawn too large
14:46 Right on red
17:02 Pull over
 
Anecdote on Reddit of someone interfering with a Waymo:

At a turning lane Waymo was stopped, as soon as it turned green, this guy throws his backpack in front of Waymo making it stop, then starts taunting the car with another bag. In the end, I rolled up my window and Waymo swerved around it after the next light, since we missed the first one. I’m just surprised support didn’t call, I didn’t think I needed to call them after we left, but kudos to Waymo for trying not to hit any obstacle, felt helpless.

 
  • Like
Reactions: DanCar
Terrible wording. If the car was driving itself then it wasn't the safety driver who crashed. Even if the safety driver failed to prevent the crash, it was still the car that crashed. No Tesla-style blame shifting allowed!!! On the other hand, if the car was in manual mode then the human truly was at fault. But in that case he was just a driver, not a "safety driver" and the car was not "driving itself".

It could be that the Waymo was in autonomous mode before the crash. The safety driver took over but failed to prevent the crash.

I was correct. Here is the collision report with the DMV. The Waymo was in autonomous mode. The safety driver transitioned to manual mode and crashed into the other vehicle:

"On March 16, 2023 at 5:10 PM PST a Waymo Autonomous Vehicle (“Waymo AV”) operating in San Francisco, California was in a collision involving a passenger car on Bayshore Boulevard between Flower Street and Cortland Avenue.

The Waymo AV was traveling north on Bayshore Boulevard in autonomous mode, merging into the rightmost lane. As the Waymo AV merged, the test driver transitioned from autonomous mode to manual mode and began accelerating. The front of the Waymo AV made contact with the rear of a passenger car that was stopped in the rightmost lane. At the time of the impact, the Waymo AV’s Level 4 ADS was not engaged and a test driver was operating the Waymo AV in manual mode. Both the Waymo AV and the passenger car sustained damage."

Source: https://www.dmv.ca.gov/portal/file/waymo_031623_2-pdf/

That was entirely the fault of the safety driver.

 
  • Informative
Reactions: Doggydogworld
I was correct. Here is the collision report with the DMV. The Waymo was in autonomous mode. The safety driver transitioned to manual mode and crashed into the other vehicle:

"On March 16, 2023 at 5:10 PM PST a Waymo Autonomous Vehicle (“Waymo AV”) operating in San Francisco, California was in a collision involving a passenger car on Bayshore Boulevard between Flower Street and Cortland Avenue.

The Waymo AV was traveling north on Bayshore Boulevard in autonomous mode, merging into the rightmost lane. As the Waymo AV merged, the test driver transitioned from autonomous mode to manual mode and began accelerating. The front of the Waymo AV made contact with the rear of a passenger car that was stopped in the rightmost lane. At the time of the impact, the Waymo AV’s Level 4 ADS was not engaged and a test driver was operating the Waymo AV in manual mode. Both the Waymo AV and the passenger car sustained damage."

Source: https://www.dmv.ca.gov/portal/file/waymo_031623_2-pdf/

That was entirely the fault of the safety driver.
Or, the car was going to crash and the driver took over, but messed up and failed to prevent the crash, so it was the fault of the safety driver, but not entirely.
 
  • Like
Reactions: flutas
Or, the car was going to crash and the driver took over, but messed up and failed to prevent the crash, so it was the fault of the safety driver, but not entirely.
I’ve inexplicably hit the accelerator on a couple of FSD disengagements. It can be pretty easy to do on a rapid takeover. Not saying it’s excusable; it’s part of the learning curve of being a safety driver. Practicing “panic mode” disengagements (the reason for both of my accelerator-pushing disengagements) is important.
 
Or, the car was going to crash and the driver took over, but messed up and failed to prevent the crash.

The report says that the car began to accelerate after it was in manual driving mode and then rear-ended the car in the right lane. So maybe the safety driver was unsure about the merge and decided to take over, but messed up by accelerating too much and hitting the car in front. We don't know if the Waymo would have crashed in autonomous mode, but it definitely sounds like human error to me. Regardless, the car was in manual driving mode when the crash happened, so the safety driver was responsible.

When the car reroutes for no reason and takes a longer route, does the customer have to pay for the extra mileage?

No.
 
If the safety driver just got impatient with a slow lane change then it's entirely his fault. But if he took over and accelerated because Waymo was making an unsafe lane change and failing to yield to someone coming up from behind, then the blame should be shared.

Obviously, we don't know if the Waymo was going to make an unsafe maneuver. So we don't know if the disengagement was warranted or not. That matters from a safety point of view, in terms of whether Waymo needs to improve the software. But in terms of liability, what matters is who was driving at the time of the collision. The car was in manual mode, so the safety driver was driving. So the safety driver was responsible for the collision. If the car had stayed in autonomous mode when it collided with the other car, then the Waymo would have been driving at the time of the collision and the Waymo would be responsible.
 
But in terms of liability, what matters is who was driving at the time of the collision. The car was in manual mode, so the safety driver was driving. So the safety driver was responsible for the collision. If the car had stayed in autonomous mode when it collided with the other car, then the Waymo would have been driving at the time of the collision and the Waymo would be responsible.
That kinda feels like "yeah, FSDb was active, but we dumped them out (5, 10, 30 s) before the collision so we're not responsible" hand-waving, imo (and yeah, blah blah, FSD is Level 2 and only an ADAS; it was just a comparison for the logic).

Like you said, we don't actually know the scenario. I could picture one where a collision was highly probable no matter what because Waymo merged incorrectly and the driver took over to try to save it (but failed). I could also picture one where the test driver was just driving to gather data and hit someone.

I think it comes down more to why the collision happened than to the exact yes/no answer most people would want, and at least in this case we don't know why.
 
WSJ has an article about Waymo testing on highways. The article is behind a paywall but here is one relevant bit:

Good to see that Waymo is not stopping in the middle of a freeway when there is an issue.
I've not read it but WSJ does seem to allow unlimited (?) gift links. I subscribe right now, so https://www.wsj.com/tech/waymo-self...wsrapumix8x&reflink=desktopwebshare_permalink should work for those who don't subscribe.
 
  • Like
Reactions: Jeff N