Waymo

Definitely; since the beginning of the video series I've been super focused on throwing all manner of weird edge cases I can find at it. Though it's tough, since you can only pick the pickup and destination; I have no influence over the route.

I've found that interesting events still occur even if I just pick a random destination and go, so nowadays that's mostly what I do. Unless I'm doing a demo for a friend or guest, in which case I do the Costco / In-N-Out loop that's guaranteed to include heavy pedestrian traffic and tough turns.
 
Ah, small detail, but I actually hired the voiceover guy since I'd been looking for an excuse to, and I like his voice. Danny Harmon; I've been following his YouTube channel for the better part of a decade. Lovely guy. But yeah, anyway, definitely a weird video; lots to analyze.
Does Waymo actively avoid construction zones? Roadside assistance said something like “they haven’t taken that [construction zone] off the map yet”.
 
...Third, I completely disagree with Waymo that stopping dead in the middle of the road was a safe response to the situation. I thought their vehicles were capable of an appropriate dynamic driving task fallback? With the vehicle stopped dead in the middle of the road, it's just a matter of time until an inattentive driver rear-ends the Waymo. ...
Agree, and it doesn't even take an inattentive driver; it can be a skilled but thoughtless one. I've seen, and personally experienced, the situation in which:
  • A car is stopped, for whatever reason, in a relatively high-speed lane.
  • A 2nd car approaches, its driver realizes in plenty of time that the lane is blocked.
  • But instead of slowing and creating a helpful brake-light / turn-signal / hazard-flasher warning, the driver cleverly gauges the distance, checks the blind spot, and executes a masterful, smooth lane change at the very last moment to clear the stopped car without missing a beat.
  • Ostensibly skilled, but incredibly dangerous:
    • As it suddenly presents any 3rd or 4th following vehicle(s) with the stopped vehicle, requiring emergency braking or avoidance maneuvering.
    • Or worse, the first car had stopped because there was a wayward pedestrian, animal, child or scooter crossing unexpectedly - setting up a potentially deadly outcome.
Admittedly the very last example is kind of beyond the Waymo discussion, but the general point is that stopping in the traffic lane is basically not a safe situation. Sure it might be necessary or the lesser evil, but it should be treated as a pretty hazardous maneuver, and probably should be accompanied by some flashing signals and loud warning sounds.
 
Adding higher-resolution sensors is not a marketing play; it clearly gives the car better perception. Do you really think that lower resolution is better? And we still don't know if Tesla's camera sensors are good enough for L4 or L5. But perception is not everything: even with excellent perception, the car still needs good planning. The issue we see in the video has nothing to do with bad perception. The Waymo detected all the cones perfectly; the problem was the planner, which was not sure which lane was open.
Already, 2-3 years ago, Amnon Shashua said “sensing is done”, meaning the problem is the driving policy and planning. I think this video makes that clear.
 
Already, 2-3 years ago, Amnon Shashua said “sensing is done”, meaning the problem is the driving policy and planning. I think this video makes that clear.

Definitely.

The fact is that perception is not why autonomous driving is so challenging. Autonomous driving is so challenging because of planning and prediction. Road users can sometimes act impulsively or randomly, which makes it hard to predict their behavior. For example, a pedestrian who sees a friend across the street might suddenly run across to meet them without looking both ways. This can complicate planning. There can also be cases, like construction zones or accidents, where you have to actually ignore the lane lines and figure out a different path. Computers tend to be very rigid in their thinking, so training a computer to think outside the box is a challenge.

Waymo has good perception. The problem is planning and driving policy, because the car has to figure out what to do based on what it sees. In this case, the Waymo saw the cones just fine, but the driving policy and planning were confused: the car seemed to want to drive in the right lane but was not sure whether the lane was open. The confusion was compounded by remote assistance giving the car bad info, telling it that yes, it should drive in the right lane that was closed. I wonder if the spacing of the cones was part of the problem. If the cones had been closer together, it might have been more obvious which lane was closed, since the car would not have been able to drive between the cones. But since the cones were spread out, it was possible for the car to drive between them, so the car may have thought the lane was open.
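To make that cone-spacing hypothesis concrete, here is a toy sketch (all names, thresholds, and logic are invented for illustration; this is not how Waymo's planner actually works):

```python
# Toy illustration of the cone-gap idea above; invented logic, not Waymo's.
# Premise: if every gap between consecutive cones is wide enough to drive
# through, a naive planner might conclude the lane is still open.

VEHICLE_WIDTH_M = 2.1   # assumed vehicle width, mirrors included
MARGIN_M = 0.5          # assumed clearance wanted on each side

def lane_looks_open(cone_positions_m):
    """Return True if every gap between consecutive cones is drivable."""
    cones = sorted(cone_positions_m)
    gaps = [b - a for a, b in zip(cones, cones[1:])]
    return all(gap >= VEHICLE_WIDTH_M + 2 * MARGIN_M for gap in gaps)

print(lane_looks_open([0.0, 15.0, 30.0]))     # True: spread-out cones look passable
print(lane_looks_open([0.0, 2.0, 4.0, 6.0]))  # False: tight spacing reads as closed
```

Under a heuristic like this, tighter cone spacing would remove the ambiguity, exactly as suggested above.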
 
The confusion was compounded by remote assistance giving the car bad info, telling it that yes, it should drive in the right lane that was closed.

I don't think we know exactly which piece of remote assistance was incorrect, do we? I figured the incorrect assistance may have been given when the car first paused just before the right-hand turn into the construction zone; something like the car sending a "I'm uncertain if it's safe to proceed; should I stay here or continue the turn?" signal and the remote operator accidentally hitting "Proceed" instead of "Stop." Everyone seemed surprised when the safety driver was ~2 minutes away and the Waymo suddenly started up again.
 
I don't think we know exactly which piece of remote assistance was incorrect, do we? I figured the incorrect assistance may have been given when the car first paused just before the right-hand turn into the construction zone; something like the car sending a "I'm uncertain if it's safe to proceed; should I stay here or continue the turn?" signal and the remote operator accidentally hitting "Proceed" instead of "Stop." Everyone seemed surprised when the safety driver was ~2 minutes away and the Waymo suddenly started up again.

True, that is a fair point. We don't know exactly what the question was. The Waymo might have asked "Do I proceed?" and Remote Assistance said "Yes", not checking that the Waymo wanted to turn into the closed right lane. But it is also plausible that the Waymo asked "Do I turn into the right lane?", since that is what the planner wanted to do, and Remote Assistance accidentally said "Yes", not realizing that the lane was closed.

I think what we do know is that the Waymo hesitated to make the right turn because of the construction zone. We also know, from the path planning shown on screen, that the Waymo appeared to want to turn into the closed right lane. Remote Assistance gave some type of bad instruction. The Waymo then turned into the closed right lane, realized it was in a closed lane, and tried to extricate itself.

It does seem that Remote Assistance never instructed the Waymo to fully stop, so the Waymo Driver was still "on" and still trying to drive. That is why we see it start moving again and begin to drive away from Roadside Assistance when they show up.
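Purely as speculation about that "never told to stop" failure mode, a minimal sketch (the states and commands here are hypothetical, not Waymo's real interface):

```python
# Speculative sketch of the failure mode described above.
# Every name here is invented for illustration, not Waymo's real API.
from enum import Enum, auto

class DriverState(Enum):
    DRIVING = auto()
    PAUSED_AWAITING_GUIDANCE = auto()
    STOPPED = auto()  # only entered on an explicit "stop" command

def apply_remote_command(state, command):
    if command == "proceed":
        return DriverState.DRIVING  # resume the drive
    if command == "stop":
        return DriverState.STOPPED  # park and stay parked
    return state                    # anything else: keep current state

# If assistance only ever answers "proceed" and never "stop", the system
# keeps returning to DRIVING; consistent with the car starting up again
# just as Roadside Assistance arrived.
state = DriverState.PAUSED_AWAITING_GUIDANCE
state = apply_remote_command(state, "proceed")
print(state)  # DriverState.DRIVING
```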
 
Waymo has three separate groups:

Fleet Response - Monitors the fleet, provides path suggestions when the car asks. Doesn't joystick the car, as this video clearly shows.
Rider Assistance (aka Rider Support) - The lady on the speaker. Interacts with riders, answers questions about the car, the trip, billing, etc.
Roadside Assistance - Employees out on the road who show up when a van breaks down or goes rogue.

Let's use these names to cut down on confusion. The three groups obviously weren't on the same page here. I've worked in giant, bureaucratic organizations and in tiny entrepreneurial startups. Waymo is the former. They developed great autonomous technology years ahead of anyone else but completely botched the rollout. That's on Krafcik, and IMHO is why he's gone. Imagine the thousands of long meetings to hash out all the processes and training procedures just for these three groups. And they still get it horribly wrong, because they're trying to handle problems they haven't yet faced instead of "moving fast and breaking things". Contrast with Elon Musk who has 50k employees but still tries Bitcoin payment on a whim. Oops, the greenie customer base doesn't like BTC's massive energy waste? OK, cancel that idea and move on to the next one.

Is it irresponsible to put a laughably inept FSD Beta into the hands of end users? Perhaps. But it keeps the ball moving forward. Waymo spends the entire pre-season in the classroom drawing plays on whiteboards, then when the game starts they all run into each other trying to execute a basic handoff.

The problem with assessing Waymo is that we don't know when remote assistance is making a decision vs. the software.
Maybe JJRicks can chime in, but I think he's said in the past that the display changes.

It is interesting that you've personally experienced 4 disengagements over about 1,100 miles of rides.
Waymo's 30k-mile metric is for "safety-related disengagements". Companies have different definitions of safety-related, which makes the reporting pretty useless. Apple apparently used to count all kinds of stuff others ignored.

Does Waymo actively avoid construction zones?
I certainly hope so! Don't you? Just last night I exited the highway to avoid a really bad one. Unfortunately there was a different construction zone on the frontage road, so I still got stuck for 5 minutes...
 
Btw, the system is not financially viable if they need roadside assistance to be within a few minutes of every car. When the cars themselves are so expensive, they would need to function completely without any close support.

I agree. But Roadside Assistance is not a permanent feature; it is a temporary measure while they improve the Waymo Driver. Right now there are some cases where the Waymo Driver needs assistance, and it is only needed for driverless rides, since rides with a safety driver can be handled by the safety driver if the car gets stuck. Eventually, the Waymo Driver will be good enough that it will never need any assistance, and then Waymo can remove all roadside assistance.
 
The Waymo approach seems intractable at this point. As Karpathy says, it's better to sacrifice sensing for scale.

Waymo will be having all these meetings and discussions about what spacing and formation of cones dictates this or that maneuver, whereas a company that has scale can just look at all the different configurations that have been encountered all across the country and make decisions based on that.
 
Eventually, the Waymo Driver will be good enough that it will never need any assistance, and then Waymo can remove all roadside assistance.
Really, you're absolutely sure about that? There is no chance that Waymo fails?

Of course we know they will never be able to "remove all roadside assistance", as cars break down; so at a minimum they will need to keep crews around for flat tires and other unexpected breakdowns.
 
I agree. But Roadside Assistance is not a permanent feature; it is a temporary measure while they improve the Waymo Driver. Right now there are some cases where the Waymo Driver needs assistance, and it is only needed for driverless rides, since rides with a safety driver can be handled by the safety driver if the car gets stuck. Eventually, the Waymo Driver will be good enough that it will never need any assistance, and then Waymo can remove all roadside assistance.
Yes. But @JJRicks' rides have needed Roadside Assistance 4 times over about 1,100 miles, so approximately once every 275 miles the car gets stuck and needs a rescue. To be able to completely get rid of close roadside assistance, I think that number needs to be at least a couple of orders of magnitude smaller.
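A quick back-of-envelope on that, assuming this small sample is representative (a big assumption):

```python
# Rescue rate from the numbers in this thread.
miles = 1_100
rescues = 4
miles_per_rescue = miles / rescues
print(miles_per_rescue)         # 275.0 miles between rescues

# "A couple of orders of magnitude smaller" would mean roughly:
print(miles_per_rescue * 100)   # 27500.0 miles between rescues
```

Interestingly, that improved figure would land near the ~30k-mile safety-related disengagement number quoted earlier in the thread.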
 
Really, you're absolutely sure about that? There is no chance that Waymo fails?

Of course we know they will never be able to "remove all roadside assistance", as cars break down; so at a minimum they will need to keep crews around for flat tires and other unexpected breakdowns.
In case of a flat tire, etc., they can send another robotaxi to get the passenger and a tow truck to get the car.
 
Let's also keep things in perspective. Waymo does hundreds of rides per week and thousands of miles with no issue.
Yes; no point crying over one incident. Definitely, state-of-the-art FSD has a ways to go; after all, Tesla says that for L5 you need 1 disengagement in a million miles, not 30k. Moreover, I'm sure they will learn from this incident. It's always best to learn from incidents that don't cause any harm.
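For scale, the gap between the two thresholds mentioned in this thread (both are just the figures quoted above, not official numbers):

```python
# Comparing the two disengagement-rate figures cited in this thread.
waymo_quoted_miles = 30_000       # miles per safety-related disengagement
tesla_l5_claim_miles = 1_000_000  # miles per disengagement, per the claim above

print(tesla_l5_claim_miles / waymo_quoted_miles)  # 33.33..., i.e. ~33x stricter
```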
 
If we start fitting all the recent Waymo news together, it doesn't look good for the Waymo approach:

Long-time CEO, CFO, partnerships lead, etc. leaving the company

No service expansion in the Chandler area

This big, messy disengagement, with mass incompetence and people having no idea what was going on or how to stop the car

If Waymo can't get their act together with HD maps, the best machine-learning experts, and the most experience, how do we expect any of these other Waymo-approach companies to succeed? Heck, can Tesla beat Waymo's performance with a general approach? Even if Tesla does, they'll need to be far better than one disengagement every ~1,000 miles to start a robotaxi service.
 
The Waymo approach seems intractable at this point. As Karpathy says, it's better to sacrifice sensing for scale.

Waymo will be having all these meetings and discussions about what spacing and formation of cones dictates this or that maneuver, whereas a company that has scale can just look at all the different configurations that have been encountered all across the country and make decisions based on that.

It's not surprising that you would reach this completely wrong conclusion, which has no basis in fact.
First of all, better sensing (lidar, 4D radar) and HD maps do not prevent scale.
I will repeat: Huawei will have a door-to-door system that will function anywhere in China in 7 months, an environment that is orders of magnitude harder to drive in than the US. They use HD maps, radars, and lidars. Better sensing doesn't prevent scale; that's misinformation and FUD.

What has been proven is that the 3 different teams were on different pages and had no direct communication.
While Remote Assistance was giving instructions to go, and never to stop or pull over, Rider Assistance was trying to get Roadside Assistance to complete the drive in manual mode.

It was a complete disconnect. This isn't a failure of the automated driving system; it was a failure of the infrastructure and logistics that Waymo set up around the Waymo Driver.

But delusional people like you think that Tesla can just flip a switch and there will be millions of L5 cars gracefully driving everywhere, in all road and weather conditions, with human-level driving intelligence and response. Meaning they won't ever get stuck, run amok, or need assistance of any kind.
 
Yes. But @JJRicks' rides have needed Roadside Assistance 4 times over about 1,100 miles, so approximately once every 275 miles the car gets stuck and needs a rescue. To be able to completely get rid of close roadside assistance, I think that number needs to be at least a couple of orders of magnitude smaller.

But none of them has been safety-related; how are people unable to understand this?