Waymo

It has to be very expensive to house the employees and to find suitable buildings to rent or purchase for the storage and maintenance of these vehicles. Putting employees in hotels or renting apartments for them will not be cheap.
Google does have an office tower in Midtown where they could keep the cars for now. Of course that wouldn't help with people, and hotels are just an expense. They may even use local employees.

EDIT: Also, as a funny aside, they should hire me. I'm a runner and know and run on every street and road in Midtown, Downtown, Westside, GA Tech and Buckhead.

Drago on Tesla etc.:

"in terms of company like Tesla that deployed more driver assist type technologies I appreciate the machine learning work and Innovations they do I think to me in their nature the system is not full self driving, right.

They do good machine learning but as with most machine learning I'm machine learning person and as much as I appreciate it,
there comes a time when you get to situations that extremely rare and or machine learning cannot handle and to have real deployment out there you need to carefully think through your stack to make sure that even if machine learning does not solve everything 100% your full software product does and that's a very big gap.

I think that's not easy to do at all and it leads potentially to rethinking core parts of your design if you are to go there and so I I mean all
of our machine learning models improving yet to to release a driverless service you need to also answer the question of
how how to make sure the whole thing is fully robust and that adds a level of design and complexity that a lot of
those companies have not tackled yet"

Sums it up 100% to me.
 
  • Like
Reactions: diplomat33

Thanks for sharing the video. I am only 20 minutes in but it is already super interesting. Anguelov really knows his stuff and he is able to get technical while still making it easy to understand.

A couple things I found noteworthy in the parts I have watched so far:

Anguelov mentions that they are building a very generalized driver to work in many different forms. He hints at robotics. I wonder if Waymo would do a Nuro-like delivery bot? Also, I think it definitely leaves the door open for Waymo to do some sort of L4 product for consumer cars in a few years, something like an "eyes off highway" system for personal cars.

He also mentions that autonomous cars are a type of robot, but that AVs need quicker reaction times, much higher safety, and the ability to respond to a dynamic environment with lots of other objects.

When asked how much heuristic code Waymo uses, he did not give a percentage, but he says that every piece of the stack (perception, prediction, planning) is built on large ML models. So it sounds to me like there is very little heuristic code, probably just some "glue code" between the NNs. From his description, it sounds like the actual driving relies on very large ML models.
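To make the "mostly glue code between large ML models" idea concrete, here is a minimal sketch of what such a modular stack could look like. Everything here is hypothetical (the class names, fields, and three-stage split are mine, not Waymo's); the point is that the "glue" is just the hand-off of intermediate representations between learned modules:

```python
from dataclasses import dataclass

# Hypothetical intermediate representations passed between learned modules.
@dataclass
class Track:                 # perception output: one tracked object
    object_id: int
    position: tuple          # (x, y) in meters, ego frame
    velocity: tuple          # (vx, vy) in m/s
    kind: str                # "vehicle", "pedestrian", ...

@dataclass
class Forecast:              # prediction output: where a track is headed
    object_id: int
    future_positions: list   # [(x, y), ...] at fixed time steps

class ModularStack:
    """Each stage is a large ML model; the glue code is just the hand-off."""
    def __init__(self, perception_model, prediction_model, planning_model):
        self.perception = perception_model
        self.prediction = prediction_model
        self.planner = planning_model

    def step(self, sensor_frame):
        tracks = self.perception(sensor_frame)        # sensors -> tracks
        forecasts = self.prediction(tracks)           # tracks -> forecasts
        return self.planner(tracks, forecasts)        # -> trajectory/maneuver

# Smoke test with stand-in callables for the learned models:
stack = ModularStack(
    perception_model=lambda frame: [Track(1, (10.0, 0.0), (5.0, 0.0), "vehicle")],
    prediction_model=lambda tracks: [Forecast(t.object_id, [t.position]) for t in tracks],
    planning_model=lambda tracks, forecasts: "keep_lane",
)
print(stack.step(sensor_frame=None))  # -> keep_lane
```

A nice side effect of those intermediate representations is that each interface can be logged and validated on its own, which ties into his later point about why a single end-to-end black box is hard to validate.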

His comments on end-to-end were interesting. He says that there has been a trend toward consolidating lots of smaller NNs into fewer, bigger ones. He mentions the advent of transformers and also the ability of new large models to do many tasks at the same time really well.

While Waymo is consolidating ML models into bigger ones, he says end-to-end would present two problems for Waymo:
1) Validation. He says Waymo has thousands of safety requirements that need to be met, and doing that validation with a single "black box" end-to-end model would be difficult.
2) Compute. Waymo uses a lot of sensors. He mentions two dozen cameras, plus lidar and radar. The onboard computing requirements for an end-to-end stack with that many sensors would be huge.

They do good machine learning, but as with most machine learning (I'm a machine learning person, and as much as I appreciate it), there comes a time when you get to situations that are extremely rare, or that machine learning cannot handle. To have a real deployment out there, you need to carefully think through your stack to make sure that even if machine learning does not solve everything 100%, your full software product does, and that's a very big gap.

Yep. This has been my criticism of Elon's vision-only, end-to-end approach: it lacks the robustness needed for safe deployment of true L4. Elon seems to think you can just train vision-only until it is 100% reliable. It does not work like that; one ML model will never be 100% reliable. That is why you need built-in redundancies to make sure that when one model fails, your entire stack still works.
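To illustrate the kind of built-in redundancy I mean, here is a toy sketch (my own, not Waymo's or Tesla's actual design): query independent models and fall back to the conservative answer when they disagree or one of them fails outright.

```python
# Toy redundancy sketch (hypothetical, not any company's actual stack).
def obstacle_ahead(camera_model, lidar_model, frame):
    votes = []
    for model in (camera_model, lidar_model):
        try:
            votes.append(model.detects_obstacle(frame))
        except Exception:
            votes.append(None)      # a crashed/failed model counts as "unknown"

    if any(v is None for v in votes):
        return True                 # a model failed: assume an obstacle, act safely
    if votes[0] != votes[1]:
        return True                 # models disagree: be conservative
    return votes[0]                 # both independent models agree
```

The point is that no single model has to be 100% reliable; the stack as a whole only fails when every independent path fails at the same time.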
 
His comments on end-to-end were interesting. He says that there has been a trend toward consolidating lots of smaller NNs into fewer, bigger ones. He mentions the advent of transformers and also the ability of new large models to do many tasks at the same time really well.
Yes. The explanation of the value of "intermediate representations", from around 23:55 to 25:40, was great.
 
Here is some of the worst Waymo driving I've seen. The unicycles (plus a couple scooters) were moving fast enough that Waymo should have just stayed in the lane and followed them. Instead it crossed the double yellow and drove toward oncoming traffic for 40 seconds! It only got back in the legal lane when one uni went in front of the car and slowed while others opened a gap for it to move into.

Even if Waymo misclassified the unis as pedestrians, it knows their speed. Why try to illegally pass? I can't imagine it'd do that if a bunch of 15-20 mph cars, motorcycles or bicycles were in front. At least I hope it wouldn't.
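The "it knows their speed" point can be made concrete. Here is a hypothetical sketch of the rule I would expect (all thresholds made up for illustration):

```python
# Hypothetical follow-vs-pass rule; the 40% threshold is made up.
def should_pass(lead_speed_mps, speed_limit_mps, pass_lane_is_legal):
    MIN_FOLLOW_FRACTION = 0.4   # follow anything doing >= 40% of the limit
    if not pass_lane_is_legal:
        return False            # never cross a double yellow to pass
    return lead_speed_mps < MIN_FOLLOW_FRACTION * speed_limit_mps

# Unicycles at ~18 mph (8 m/s) on a 25 mph (11.2 m/s) street, double yellow:
print(should_pass(8.0, 11.2, pass_lane_is_legal=False))  # False: just follow
```

Either check alone (their speed, or the double yellow) should have kept the car in its lane here.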
 
Here is some of the worst Waymo driving I've seen. The unicycles (plus a couple scooters) were moving fast enough that Waymo should have just stayed in the lane and followed them. Instead it crossed the double yellow and drove toward oncoming traffic for 40 seconds! It only got back in the legal lane when one uni went in front of the car and slowed while others opened a gap for it to move into.

Even if Waymo misclassified the unis as pedestrians, it knows their speed. Why try to illegally pass? I can't imagine it'd do that if a bunch of 15-20 mph cars, motorcycles or bicycles were in front. At least I hope it wouldn't.

Yeah, that was definitely very bad driving on Waymo's part. Waymo has generally been a much better driver than that, so I am surprised and disappointed that it would make a mistake like this. Waymo is fortunate there was no accident. My guess is that the chaotic nature and the sheer number of unicycles was a factor: there were so many unicycles, they were taking up the entire road, and they were behaving very erratically. That is a potential explanation but not an excuse. I agree the Waymo should have just stayed behind them. There was no need to pass them, and it was illegal too.

The only possibility I can think of that might justify Waymo's actions is if it felt it had to move over to avoid a collision. If the unicycles were moving slowly enough that there was a chance of the Waymo rear-ending them, and they were taking up the entire road so there was nowhere in the lane to avoid them, then Waymo may have felt it needed to move over to avoid hitting them.
 
The only possibility I can think of that might justify Waymo's actions is if it felt it had to move over to avoid a collision. If the unicycles were moving slowly enough that there was a chance of the Waymo rear-ending them, and they were taking up the entire road so there was nowhere in the lane to avoid them, then Waymo may have felt it needed to move over to avoid hitting them.
Even in your scenario, the Waymo got itself into the situation; it shouldn't have been possible for it to be close enough to "rear end" them if it was being safe. Also, what happened to defensive driving? "Someone's going slow, better illegally pass them" isn't defensive or safe.

I doubt it was that, though. I think the Waymo went into "I'm gonna pass a bike" mode and just never stopped, for some reason we won't ever know (likely not planning far enough ahead to see multiple "bikes"). You can see that when it starts, the first one is close to the right, like a bike would be riding.

Now I'm curious if a Waymo would drive into oncoming traffic to pass a peloton.

ETA: Another video of the same incident that gives a better perspective of it, imo.
Plus another clip I haven't seen posted here, of a Waymo starting to pull out into high-speed traffic and stopping while likely encroaching into the oncoming lanes.

 
He also mentions that autonomous cars are a type of robot, but that AVs need quicker reaction times, much higher safety, and the ability to respond to a dynamic environment with lots of other objects.

With response time and safety short of an attentive human, you're left with an overpriced novelty, even for L2.

I would love to know their max response time requirement. I would guess less than 500ms.
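For scale, a back-of-the-envelope on what that latency means in distance (the 500 ms is my guess above, not a published Waymo number):

```python
# Distance traveled during the reaction latency alone, before any braking.
def reaction_distance_m(speed_mph, latency_s):
    speed_mps = speed_mph * 0.44704        # mph -> m/s
    return speed_mps * latency_s

for mph in (25, 45, 65):
    print(f"{mph} mph, 500 ms -> {reaction_distance_m(mph, 0.5):.1f} m")
# 25 mph -> 5.6 m, 45 mph -> 10.1 m, 65 mph -> 14.5 m
```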
 
Even in your scenario, the Waymo got itself into the situation; it shouldn't have been possible for it to be close enough to "rear end" them if it was being safe. Also, what happened to defensive driving? "Someone's going slow, better illegally pass them" isn't defensive or safe.

I doubt it was that, though. I think the Waymo went into "I'm gonna pass a bike" mode and just never stopped, for some reason we won't ever know (likely not planning far enough ahead to see multiple "bikes"). You can see that when it starts, the first one is close to the right, like a bike would be riding.

Now I'm curious if a Waymo would drive into oncoming traffic to pass a peloton.

ETA: Another video of the same incident that gives a better perspective of it, imo.
Plus another clip I haven't seen posted here, of a Waymo starting to pull out into high-speed traffic and stopping while likely encroaching into the oncoming lanes.

Yikes on that left turn. Looks like Waymo is trying to use neural nets for planning too. How hard is it to calculate future positions of vehicles when you know their exact speed and trajectory? Apparently an impossible AI problem.
 
Another interesting thing from today.

Stop signs printed on T-shirts are an attack vector for Waymo vehicles (not that I'm surprised; this is one of those "how the hell do you handle this" scenarios). It also gives me a fun experiment to try on FSD (not a printed stop-sign shirt, though).

This seems weird, as lidar and radar, and also HD maps (the latter might be ignored for road work, etc.), would likely mitigate a simple image attack to which passive sensors might be susceptible.
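A sketch of the kind of cross-check I am imagining (entirely hypothetical, not Waymo's actual logic): only treat a camera-detected stop sign as real if an independent source corroborates it.

```python
# Hypothetical cross-check; not Waymo's actual logic.
def honor_stop_sign(camera_sees_sign, map_has_sign_here, lidar_sees_sign_shape):
    if map_has_sign_here:
        return True     # mapped sign: stop regardless of what the camera says
    if camera_sees_sign and lidar_sees_sign_shape:
        return True     # unmapped but physically corroborated (e.g., road work)
    # Camera-only detection, like a sign printed on a T-shirt: don't treat it
    # as a real sign; slow down and reassess instead of hard-stopping.
    return False

print(honor_stop_sign(True, False, False))  # False: likely spoofed
```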

I guess better safe than sorry.
 

FYI, here are the timestamps for the video if anyone is interested in jumping to a specific topic:
  • 04:13 - Drago's History
  • 07:16 - Drago's Education
  • 08:28 - Waymo's Development
  • 13:12 - Hard Coding & Machine Learning
  • 14:42 - Local vs Cloud Models
  • 15:57 - Number of Models
  • 19:04 - Progression of Stack Tasks
  • 22:36 - Wayve's Approach
  • 25:39 - Robotic Transformer 2
  • 26:18 - Other Applications of Waymo's Tech
  • 30:05 - Integration of New Technologies
  • 32:10 - Handling Corner Cases
  • 34:09 - Waymo's Open Data Set
  • 39:07 - Open Data Set Collection
  • 42:04 - Vehicle Data Sharing
  • 43:03 - Model Generalization
  • 43:45 - Model Generalization Across Regions
  • 45:38 - Cruise Incident
  • 49:39 - Chinese Competition
  • 51:29 - Tesla's Driver Assist
  • 52:50 - Waymo Personal Vehicles
  • 54:40 - City Expansion and Highways
  • 56:27 - Exciting Developments
  • 59:01 - Waymo One Affordability
 
Yikes on that left turn. Looks like Waymo is trying to use neural nets for planning too. How hard is it to calculate future positions of vehicles when you know their exact speed and trajectory? Apparently an impossible AI problem.

Nothing is an impossible AI problem forever.

It is not too hard to calculate future positions of vehicles when you know their exact speed and trajectory; the Waymo stack does this many times per second with high accuracy. In the video, we see the Waymo start to creep forward and then stop. I think the Waymo initially calculated the future positions of the vehicles accurately and determined it was safe to make the turn, so the planner told the car to proceed, but the Waymo moved too slowly. This was a case where, if the Waymo had "gunned it" immediately, I think it could have made the turn safely. Instead it hesitated a fraction of a second, and that was enough for the turn not to be safe anymore. The Waymo recalculated the future positions of the vehicles, determined that the turn was no longer safe, and the planner told the car to stop.

So the issue is not calculating future positions of vehicles; the issue is the planner and the controls executing the turn. The Waymo was moving too slowly to clear the turn, so it needed to reverse its decision and stop. If anything, the Waymo's accurate behavior predictions prevented a collision, because it was able to determine in a split second that it was moving too slowly to clear the turn safely and therefore needed to stop instead of continuing. If my interpretation is correct, then this is a planner issue, where the planner was too hesitant, not a behavior prediction issue.

In other words, the Waymo planner is trying to have it both ways: it picks a gap that you can clear if you make the turn assertively, but then makes the turn too cautiously, misses its chance, and has to stop halfway, blocking traffic. I think it is better to be one or the other: either be assertive and go for it, or be cautious and wait for a bigger gap.
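A toy version of the gap check I am describing (all numbers and the margin are made up): extrapolate the oncoming car at constant velocity, compare its arrival time at the conflict point with the time the ego car needs to clear it, and re-run the check every planning cycle. That re-check is exactly where a hesitant start flips "go" into "stop":

```python
# Toy gap-acceptance check for an unprotected left; all numbers made up.
# The decision is re-evaluated every cycle, so a slow start can turn an
# initially safe gap into an unsafe one mid-maneuver.
def gap_is_safe(oncoming_dist_m, oncoming_speed_mps,
                ego_time_to_clear_s, margin_s=1.5):
    # Constant-velocity extrapolation: when does the oncoming car arrive?
    time_to_conflict_s = oncoming_dist_m / max(oncoming_speed_mps, 0.1)
    return time_to_conflict_s > ego_time_to_clear_s + margin_s

# Oncoming car 80 m away at 20 m/s arrives in 4.0 s:
print(gap_is_safe(80, 20, ego_time_to_clear_s=2.0))   # True: assertive turn fits
# Hesitate ~1 s (car now 60 m away) and clear more slowly:
print(gap_is_safe(60, 20, ego_time_to_clear_s=2.5))   # False: planner must abort
```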
 
The only possibility I can think of that might justify Waymo's actions is if it felt it had to move over to avoid a collision. If the unicycles were moving slowly enough that there was a chance of the Waymo rear-ending them, and they were taking up the entire road so there was nowhere in the lane to avoid them, then Waymo may have felt it needed to move over to avoid hitting them.

My theory might not be that far off, if you believe Waymo. They told the SF Chronicle that the car did the maneuver on purpose, for safety reasons.

Here is Waymo's PR response to the incident:

"Waymo told the Chronicle in a statement that the robotaxi “detected that there may be a risk of a person within that crowd who had fallen down, and decided to carefully initiate a passing maneuver when the opposing lane was clear to move around what could be an obstacle and a safety concern.”
“After starting that maneuver, out of an abundance of caution around these vulnerable road users, and to avoid getting too close or cutting them off, the Waymo remained in the oncoming lane for longer than necessary before returning to its original lane of travel,” the company said. “The safety of all road users is a top priority for Waymo, and we look forward to learning from this unique event.”"