
Waymo

Some challenging cases in this video, with busy traffic, construction zones, and foggy, dark conditions. The first construction zone did involve remote assistance helping the car with routing. We can see from the path on the screen that the car wanted to go straight, but the path was blocked due to construction. Remote assistance tells the car to turn right.


0:00 Introduction
0:19 First construction site
0:59 Remote assistance request & traffic behind
1:41 New route from remote assistance
2:16 Rider support call
5:46 Blocking the crosswalk, but not the intersection
6:30 Proceeding on yellow light with cut-in
7:29 Yielding to vehicle with intent to merge
9:11 Second construction site
10:41 Driving on freeway ramp 1 of 2
13:33 Driving on freeway ramp 2 of 2
14:50 Construction debris mapped as off-road
15:00 Pull over
16:21 Pedestrian in street
16:33 Not technically a u-turn
17:10 Yielding to pedestrians with intent to cross
18:55 Cool tree lights in the fog
19:07 Lane change for unknown
19:57 Pull over
 
Some challenging cases in this video, with busy traffic, construction zones, and foggy, dark conditions. The first construction zone did involve remote assistance helping the car with routing. We can see from the path on the screen that the car wanted to go straight, but the path was blocked due to construction. Remote assistance tells the car to turn right.
0:15 - Car in front clears the path, but Waymo wants to veer left to where the truck with the "> > >" sign is, then wander through the construction zone
0:17 - Planned path briefly gets it "right" (right turn), but immediately switches back into construction zone rampage mode
1:05 - Screen shows "Our Team is working to get you moving" -- remote assistance responding to request from vehicle
1:41 - Remote assistance locks in new path (right turn)
1:50 - "You're back on your way". Waymo starts moving, then stops for pedestrians.
2:06 - Rider gets notice on app that Roadside Assistance (human driver) is 7 minutes away
2:12 - Pedestrians clear crosswalk, Waymo starts executing a right turn on red
2:16 - "Connected to Rider Support" (voice call)
2:30-3:00 - Rider support lady gets notification that car is no longer blocked, says goodbye and hangs up
9:06 - Roadside Assistance arrives, wonders where car is (OK, I made this part up)

The same three parties involved in Conegate are active here, and one hand still doesn't know what the others are doing. This result is better because Remote Assistance chose the correct course. And Rider Support did get the news the car is moving again, it just came 40-50 seconds late instead of several minutes late. Progress!

I assume Roadside Assistance was also notified the car was moving again, and returned to base. Anyway, it seems it took 40 seconds for Remote Assistance to get involved and another 45 seconds for them to choose the correct course and lock it in. At this point the event should have ended, but Roadside Assistance somehow got activated 16 seconds after the car was moving again and Rider Support called in to talk to the riders another 10 seconds after that.
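
As a sanity check on those intervals, here is a quick Python sketch that recomputes the gaps from the video's chapter timestamps listed above (the event labels are my paraphrase of the chapter titles):

    # Timeline from the video chapters above, converted from mm:ss to seconds.
    events = [
        ("stopped at first construction site",  19),  # 0:19
        ("remote assistance request",           59),  # 0:59
        ("new route from remote assistance",   101),  # 1:41
        ("car moving again",                   110),  # 1:50
        ("roadside assistance notice in app",  126),  # 2:06
        ("rider support call connects",        136),  # 2:16
    ]

    # Print the gap between each consecutive pair of events.
    for (prev, t0), (name, t1) in zip(events, events[1:]):
        print(f"{name}: +{t1 - t0}s after {prev}")

The gaps come out to 40s, 42s, 9s, 16s, and 10s, consistent with the rough 40/45-second figures above.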
 
Waymo's CPO says that "it's designed to do the right thing even when support isn't available". That implies to me that the Waymo might stop or it might take some other action it deems appropriate if the human is not available to answer. Basically, the Waymo is programmed to make a decision on its own when remote assistance is not available. So we cannot say that the Waymo would always stop if remote assistance is not available.
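
To make that interpretation concrete, here is a purely hypothetical Python sketch of what such a fallback policy might look like. None of this is Waymo's actual logic; every name and branch is invented just to illustrate the claim that stopping is one option among several, not the guaranteed behavior:

    from enum import Enum, auto

    class Action(Enum):
        PROCEED = auto()       # act on its own best judgment
        REROUTE = auto()       # follow routing guidance from remote assistance
        PULL_OVER = auto()
        STOP_IN_LANE = auto()

    # Hypothetical decision function, invented for illustration only.
    def choose_action(remote_available: bool, safe_path_found: bool,
                      can_pull_over: bool) -> Action:
        if remote_available:
            # The case in the video: ask remote assistance for routing help.
            return Action.REROUTE
        if safe_path_found:
            # "Do the right thing even when support isn't available":
            # proceed on its own rather than always stopping.
            return Action.PROCEED
        # Stopping is the last resort, not the default.
        return Action.PULL_OVER if can_pull_over else Action.STOP_IN_LANE

    # E.g. with no remote support but a safe path, this policy proceeds:
    print(choose_action(False, True, True))  # Action.PROCEED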

Some challenging cases in this video, with busy traffic, construction zones, and foggy, dark conditions. The first construction zone did involve remote assistance helping the car with routing. We can see from the path on the screen that the car wanted to go straight, but the path was blocked due to construction. Remote assistance tells the car to turn right.



Nothing like an example posted the same day of the car literally just stopping in the middle of the road for a minute and a half waiting for remote assistance.
 
Nothing like an example posted the same day of the car literally just stopping in the middle of the road for a minute and a half waiting for remote assistance.

I am going by what the Chief Product Officer said: "it's designed to do the right thing even when support isn't available". I interpret that to mean the car will not stop every time. So the car happened to stop in this case because the Waymo Driver determined that stopping and asking remote assistance for guidance was the right thing to do, but in another case it might not stop if it deems that stopping is not the right thing to do.
 
I am going by what the Chief Product Officer said: "it's designed to do the right thing even when support isn't available". I interpret that to mean the car will not stop every time. So the car happened to stop in this case because the Waymo Driver determined that stopping and asking remote assistance for guidance was the right thing to do, but in another case it might not stop if it deems that stopping is not the right thing to do.
Yes, but that whole conversation started because you made grandiose claims about Waymo

In terms of autonomy, it is completely irrelevant how often it happens since it does not make the car any less autonomous.

someone clarified

I believe what most are referring to is IF this human wasn't there to answer the car's beckon for help/advice, it would stop, unlike an aircraft.

and then you try to downplay it by just quoting a C-suite exec. In the same vein, I could quote Elon about FSD all I want, but it doesn't make it true either (as much as he wishes it would).

I don't think we can assume that. Waymo's CPO says that "it’s designed to do the right thing even when support isn’t available". That implies to me that the Waymo might stop or it might take some other action it deems appropriate if the human is not available to answer. Basically, the Waymo is programmed to make a decision on its own when remote assistance is not available. So we cannot say that the Waymo would always stop if remote assistance is not available.

To me, I never saw the Waymo trying "to make a decision on its own" there. It froze for ~45s doing nothing, then decided to contact remote support, and was stuck there for another 45s.

Now where would the "make its own decision" branch come up here? Was it the 45-second wait at the start, before it contacted support, or would it come later, after sitting for some amount of time?
 
Yes, but that whole conversation started because you made grandiose claims about Waymo

What grandiose claims? That Waymo is currently ahead in autonomous driving? Waymo has the most driverless or unsupervised FSD miles. And Waymo has done 700,000 driverless rides across 2 metro areas and 5M+ driverless miles this year, more than anybody else has done. So yes, they are ahead. That is not a grandiose claim.

To me, I never saw the Waymo trying "to make a decision on its own" there. It froze for ~45s doing nothing, then decided to contact remote support, and was stuck there for another 45s.

It did not freeze for 45 seconds doing nothing. It was stopped at a red light for most of that time. It contacted remote support when the light turned green and it was not sure what to do.

Now where would the "make its own decision" branch come up here? Was it the 45-second wait at the start, before it contacted support, or would it come later, after sitting for some amount of time?

He says if remote assistance is not available, it will make its own decision. In this case, it was able to contact remote assistance. We can assume that if remote assistance had not responded, the car would have made its own decision after a moment.
 
What grandiose claims? That Waymo is currently ahead in autonomous driving? Waymo has the most driverless or unsupervised FSD miles. And Waymo has done 700,000 driverless rides across 2 metro areas and 5M+ driverless miles this year, more than anybody else has done. So yes, they are ahead. That is not a grandiose claim.
First of all, I literally quoted you and said what I claimed it to be, but since you're allergic to reading your own words and need to try to spin everything into "WAYMO IS JEEBUS CONFIRMED":

No, they questioned how often Waymo is having to give manual assistance, just like Cruise was caught having to do every ~5 miles but "oh we didn't disengage, just told it what to do manually while it had no idea what to do, clearly that's not a disengagement!" You then jumped straight to "well they *say* it doesn't happen and I believe them without a doubt." That is the grandiose claim.

It did not freeze for 45 seconds doing nothing. It was stopped at a red light for most of that time. It contacted remote support when the light turned green and it was not sure what to do.

At :13 the car's pathing is a direct drive forward, with the car in front of it highlighted green.

At :14 this changes to the truck to its left being highlighted in green (i.e. a vehicle it's using for something) and the pathing showing it hitting the truck and running over 4 cones.

As for the "it's waiting for a red light" you can even SEE the stop line when the car stops from the back seat, and the car is at least a full car length behind the stop line as evidenced by the truck being rendered as before the stop line, even though the waymo is literally behind where the truck is in the adjacent lane).

If the car was waiting for a red light... wouldn't it move (even just twitch) when the red light changes?

So yeah, that's a stall. That's not "waiting for a red light" as much as you try to spin it.

He says if remote assistance is not available, it will make its own decision. In this case, it was able to contact remote assistance. We can assume that if remote assistance had not responded, the car would have made its own decision after a moment.
We can't "assume" anything. As my mom would always say, when you assume you just make an ass out of you, me and anyone who believes you. Assumptions are how idiots are made, just like Elon assumed FSD would be a robotaxi by now. 😉
 
That's nonsense. Don't go by the stupid levels - use common sense.

If the car stops every 100 ft for 1 1/2 minutes, it's *way* less autonomous than a car that stops every 100k miles.
The autonomous levels are what everyone making these systems goes by. If the car stops every 100 ft for 1 1/2 minutes, common sense dictates that the vehicle is a bad autonomous vehicle, but it is still autonomous nonetheless. The levels do not dictate how good or bad an ADS is, just the role of the driver/passenger when the feature is engaged.
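
For reference, here is a rough paraphrase of the SAE J3016 levels as a quick sketch (the one-line descriptions are mine). The point is that each level describes who does what when the feature is engaged, not how well the system performs:

    # Rough paraphrase of the SAE J3016 levels: who drives, not how well.
    SAE_LEVELS = {
        0: "No driving automation: the human does all the driving.",
        1: "Driver assistance: steering OR speed support; the human drives.",
        2: "Partial automation: steering AND speed support; the human supervises.",
        3: "Conditional automation: the system drives in its domain; the human is the fallback.",
        4: "High automation: the system drives and handles its own fallback in its domain.",
        5: "Full automation: the system drives everywhere; no human is needed.",
    }

    for level, role in SAE_LEVELS.items():
        print(f"Level {level}: {role}")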
 
That's nonsense. Don't go by the stupid levels - use common sense.

The levels are not stupid; they are how the entire AV industry defines autonomy. So yes, we should use the levels, because we should use the definitions that the industry uses. We should not make up our own definitions. What you call "common sense" is just code for "I am going to make up whatever definition I want that suits my Tesla agenda". I know the SAE levels are inconvenient to Tesla fans, but the whole reason we have industry standards like the SAE levels is precisely so that everybody does not just make up their own personal definitions.

If the car stops every 100 ft for 1 1/2 minutes, it's *way* less autonomous than a car that stops every 100k miles.

As Bitdepth correctly pointed out, if the car stops every 100 ft but is still in autonomous mode, then it is bad autonomy, but still autonomy. It is not less autonomy, it is just bad autonomy that would not be practical for a commercial product.

For example, if a horse loses every race, it is a bad race horse, but it is still a horse. Being a horse is a biological characteristic that does not change whether the horse is fast or not. Likewise, autonomy is a fundamental engineering characteristic of a driving system. It does not change if the system is unreliable. You are trying to redefine autonomy based on how reliable or practical it is as a commercial product, but that is not how autonomy is defined. You cannot just make up your own definitions that suit you.

 
The autonomous levels are what everyone making these systems goes by.
As Bitdepth correctly pointed out, if the car stops every 100 ft but is still in autonomous mode, then it is bad autonomy, but still autonomy. It is not less autonomy, it is just bad autonomy that would not be practical for a commercial product.
The levels are not stupid,
The levels are stupid - as you guys prove yourselves above.

Widespread usage doesn't make something not stupid - like using leeches to cure diseases in the Middle Ages proves.
 
The levels are stupid - as you guys prove yourselves above.

Widespread usage doesn't make something not stupid - like using leeches to cure diseases in the Middle Ages proves.
Please inform yourself before calling things stupid. Leeches are still used in modern medicine. Leeches are used to promote the circulation of blood in a specific area, e.g. after surgical procedures, because they secrete hirudin and calin, which prevent blood clotting.

 
Please inform yourself before calling things stupid. Leeches are still used in modern medicine. Leeches are used to promote the circulation of blood in a specific area, e.g. after surgical procedures, because they secrete hirudin and calin, which prevent blood clotting.


What a load of horseshit - I'm a physician, and no one in any developed-world healthcare system uses leeches anymore. Labs still study them, occasionally, but they are NOT APPROVED by ANY governing body for medical use in the US.

What you quoted was a research article of basically "theoretical uses", and nothing more.
 
What a load of horseshit - I'm a physician, and no one in any developed-world healthcare system uses leeches anymore. Labs still study them, occasionally, but they are NOT APPROVED by ANY governing body for medical use in the US.

What you quoted was a research article of basically "theoretical uses", and nothing more.
I'm not a physician, but I am a lab tech, and yes, leeches are approved by the FDA and NHS for use.

 