
Waymo

Does Waymo have an issue with hiring good safety drivers? Or maybe it's difficult to transition from monitoring an autonomous system for several hours to driving manually?

Statement from Waymo PR about another incident where a vehicle was being driven manually:
Another? I didn't know about the others...

I can understand accidents - but hitting a pedestrian - especially when driven manually?
 
Does Waymo have an issue with hiring good safety drivers? Or maybe it's difficult to transition from monitoring an autonomous system for several hours to driving manually?

Statement from Waymo PR about another incident where a vehicle was being driven manually:


I suspect the latter. Making that transition from monitoring autonomous driving for hours to driving manually can be difficult, especially if the autonomous driving worked really well and did not require any interventions for hours and hours. It requires two very different skill sets. In one case, your focus was not on driving but on checking the systems and checking for any possible issues. That's very different from manual driving where you have to monitor the road and other users and do the driving tasks yourself again. And the safety driver may just be a little tired from monitoring the car for so long, or maybe the safety driver lost focus a bit and did not see the pedestrian.

I am sure Waymo will do a reconstruction of the accident in their simulations. So Waymo will have data to show if the Waymo Driver could have prevented the accident or not. Also, we need to know why the safety driver was driving manually. Was it following a safety disengagement? Was it because of a situation that the safety driver thought the Waymo Driver could not handle? If the Waymo Driver can handle it, then I don't think the safety driver has any business driving manually.

But I think it begs the question: at what point is the autonomous driving good enough that having humans drive manually is less safe? I mean, if the data shows that the Waymo Driver would have prevented the accident, then perhaps having safety drivers drive manually is making the cars less safe, and Waymo should remove the safety drivers, at least in the areas where the Waymo Driver is very experienced. It does not make sense to me to tout that your autonomous driving is so safe and then get into accidents because you are not using said autonomous driving.

After all, we keep talking about how autonomous driving will be better than human driving because autonomous cars won't get tired or distracted. So autonomous driving will prevent accidents caused by humans. If humans are driving your autonomous car and causing accidents that the autonomous driving would have prevented, it would seem that the answer should be to remove the humans from driving. I would ask the question: why are the safety drivers driving manually at all, since it is an L4 car?

Now, if the Waymo Driver is not safer than human driving and Waymo really needs safety drivers to prevent accidents, fine. Then, I think they should revise their safety driver procedures. Maybe reduce the number of hours that safety drivers have to monitor the car to prevent fatigue or change the procedure for when they can transition to manual driving to make it easier.
 
  • Like
Reactions: willow_hiller
Does Waymo have an issue with hiring good safety drivers? Or maybe it's difficult to transition from monitoring an autonomous system for several hours to driving manually?

Statement from Waymo PR about another incident where a vehicle was being driven manually:

Waymo claims to be driving 100,000 miles a week in San Francisco, so frequent collisions are not surprising. It's not clear how much they drive in manual mode. It could be that they're driving a majority of miles in manual mode to collect data and using autonomous mode in conditions with lower collision probability.
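
As a rough back-of-the-envelope check (the human crash rate here is an assumed order-of-magnitude figure, not a measured one):

# Rough illustration only: expected crash frequency at 100,000 miles/week
# if the fleet crashed at a roughly human-level rate.
# ASSUMPTION: on the order of one police-reported crash per ~500,000 miles for human drivers.
miles_per_week = 100_000
assumed_crashes_per_mile = 1 / 500_000

expected_crashes_per_week = miles_per_week * assumed_crashes_per_mile   # 0.2
weeks_per_crash = 1 / expected_crashes_per_week                         # 5
print(f"~{expected_crashes_per_week:.2f} crashes/week, about one every {weeks_per_crash:.0f} weeks")

So even at a purely human-level rate, the occasional collision at that mileage is expected; the more interesting comparison would be the manual-mode rate versus the autonomous-mode rate.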
 
Also, we need to know why the safety driver was driving manually. Was it following a safety disengagement?
I think this is the main thing. If the driver disengaged and took over because of a developing unsafe situation - I can understand if they didn't have enough time to prevent an accident. Given how infrequent the disengagements are, it's going to be a problem for Waymo.

If the driver was driving manually for some time, then it's just another manual driving accident, like the ones that happen all the time (though hitting pedestrians is relatively rare).
 
  • Like
Reactions: diplomat33
Another? I didn't know about the others...
They hit a guy on a motorized stand-up scooter in June. Also in manual mode. Not sure for how long, but it wasn't a last-minute takeover.
But I think it begs the question: at what point is the autonomous driving good enough that having humans drive manually is less safe?
Their at-fault crashes all seem to be in manual mode (suspiciously so?). I remember a couple of years ago a safety driver accidentally disengaged autonomous mode in his sleep and ended up in the median. Another safety driver intervened because the van moved to the left for "no reason". It turns out it was giving space to a lane-splitting motorcyclist coming up from behind. The safety driver moved the van back into the biker's path, causing him to hit the rear fender (legally maybe not Waymo's fault, but a clear dick move). I imagine their s/w would have avoided both of this year's pedestrian hits in San Francisco, too. Of course it might have caused some other problem.
 
  • Helpful
Reactions: willow_hiller
I think this is the main thing. If the driver disengaged and took over because of a developing unsafe situation - I can understand if they didn't have enough time to prevent an accident. Given how infrequent the disengagements are, it's going to be a problem for Waymo.

If the driver was driving manually for some time, then it's just another manual driving accident, like the ones that happen all the time (though hitting pedestrians is relatively rare).
“We can confirm the vehicle was being driven in manual for the entirety of its mission.”
 
Waymo has quite a lot going on at CES 2022:


 
Waymo posted a new blog with details on their autonomous trucks and the improvements made to the 5th Gen hardware and software:

Read here: Waypoint - The official Waymo blog: Designed to deliver: Bringing the benefits of our 5th generation hardware to trucking

A few tidbits I found interesting:

Waymo's camera vision can classify objects up to 1km away:

So, using our cutting-edge perception, on top of the data collected by our high definition long-range cameras, we have trained the Driver to perceive and classify objects, such as a vehicle, up to 1,000 meters away.

5th Gen also has thermal cameras:

To help detect vulnerable road users at night and in inclement weather conditions, we’ve included thermal cameras in our vision system.

5th Gen lidar has higher resolution and can be used to detect weather conditions to know when to clean sensors:

Our next-gen lidar also boasts a significant increase in resolution, allowing our Driver to see the world in incredible detail. Not only does this high-fidelity data help the Driver identify objects at greater distances, but it also provides the Waymo Driver a deeper understanding of the weather conditions it's navigating, a challenge on long-haul routes—like the density of the fog or the precipitation level of rain—to know when and how frequently to clean the sensors and how much to adjust its speed.
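
To make that concrete, here is a minimal sketch of what such a policy could look like; every name and threshold below is my own assumption, not anything Waymo has published:

# Hypothetical sketch: map lidar-derived weather estimates to a sensor-cleaning
# cadence and a speed adjustment. All values are invented for illustration.
def weather_policy(fog_density: float, rain_rate_mm_h: float):
    """Return (cleaning_interval_s, speed_cap_mph) for the estimated conditions."""
    if fog_density > 0.7 or rain_rate_mm_h > 10.0:   # dense fog or heavy rain
        return 30, 45        # clean every 30 s, cap speed at 45 mph
    if fog_density > 0.3 or rain_rate_mm_h > 2.0:    # light fog or light rain
        return 120, 55
    return 600, None         # clear conditions: occasional cleaning, no extra cap

interval_s, speed_cap = weather_policy(fog_density=0.5, rain_rate_mm_h=0.0)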

HD radar can detect static and moving objects over 500 meters away:

Waymo’s state-of-the-art imaging radar boasts unparalleled range and resolution compared to other radars commercially available today. With it, we can detect and track many targets at over 500 meters, providing our technology a higher confidence and ability to reason about and detect both static and moving objects.

As an RF sensor, radar is not affected by background light from over-exposure from a low-angle sun or under-exposure at night, and can directly measure velocity in a single frame whereas other sensors typically infer velocity using multiple frames. Additionally, the sensors’ signal processing technique to see simultaneously in different directions makes it mechanically simple, inexpensive, and reliable.
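
To illustrate the velocity point: cameras and lidar typically have to difference a tracked object's position across frames, while radar reports a Doppler (radial) velocity directly in a single frame. A toy example with made-up numbers:

# Toy comparison (illustrative numbers only).
# Camera/lidar style: infer velocity by differencing tracked positions over time,
# which requires at least two frames.
pos_prev_m, pos_curr_m = 120.0, 118.6   # range to a tracked object in two frames
dt_s = 0.1                              # time between frames
inferred_velocity_mps = (pos_curr_m - pos_prev_m) / dt_s   # -14 m/s (closing)

# Radar style: each detection already carries a measured radial velocity,
# so the closing speed is known from a single frame.
radar_detection = {"range_m": 118.6, "radial_velocity_mps": -14.2}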
 
So they have wipers for the lidar?

Yes, I believe so. They mention that the sensors have a self-cleaning system that employs nozzles, wipers, etc.:

Because we are designing a system that travels hundreds of miles a day, we developed a robust automated onboard cleaning system that employs various cleaning tools — like custom nozzles, wipers, coatings and more—to keep our sensors clean and the Waymo Driver on the road in some of the harshest environmental conditions. We’ve also developed a comprehensive sensor cleaning policy that detects and dictates when and how to clean the sensors and under what conditions, all without human intervention.
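
Here is a hypothetical sketch of what "detects and dictates when and how to clean" might reduce to in practice; again, all names and thresholds are my own assumptions for illustration:

# Hypothetical per-sensor cleaning decision (names/thresholds invented).
def choose_cleaning_action(occlusion_score: float, temperature_c: float):
    """occlusion_score in [0, 1]: how blocked/dirty the sensor currently looks."""
    if occlusion_score < 0.1:
        return None                        # clean enough, do nothing
    if occlusion_score < 0.4:
        return "air_nozzle"                # light dust or mist: quick air puff
    if temperature_c <= 0.0:
        return "heated_washer_then_wiper"  # ice or frost: heat, fluid, then wipe
    return "washer_then_wiper"             # heavier grime or road spray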

Also, here are the wipers in action for the lidar on the 4th Gen. I imagine the 5th Gen would have something similar.


Since the weather is the same for the lidar and the camera, they can just use the camera to detect weather too, like Tesla does.

Sure. I guess they could.