Waymo

That is a pretty crazy move. Waymo performed it safely, but I am not sure it was wise. Ideally, I think you want to drive in a predictable way for other drivers. So if you start to make a right turn, you should complete the right turn as expected, not suddenly make a left turn in the middle of the intersection when the oncoming traffic has a green light to go.
Only problem is the Waymo was turning left, not right. Stopping a right turn and continuing straight isn't bad. Stopping a left turn 90% of the way through and deciding to go straight... well, I think Waymo would much prefer the car to get stuck than to drive head-on towards traffic while on the wrong side of the road...
 
Looks like their collisions are about the same: 71 in 2022, 50 in 2023, and 14 (x3 = 42 annualized) so far this year. Though I didn't look to see how many were with a safety driver driving (safety drivers seem to get into dumb low-speed collisions).
But "failure rate" <> "collision". That weird way of aborting the turn will not come in any publicly available stat ...
 
Only problem is the Waymo was turning left, not right. Stopping a right turn and continuing straight isn't bad. Stopping a left turn 90% of the way through and deciding to go straight... well, I think Waymo would much prefer the car to get stuck than to drive head-on towards traffic while on the wrong side of the road
The video doesn't give the whole story - we don't really know why the Waymo initially changed its mind - maybe it got stuck in the intersection because of traffic ahead of it, and decided to find a safe solution rather than blocking the intersection.

A human driver may have done exactly the same thing, in which case 10 points to the Waymo.

However, a good human driver would have presumably ensured the path into the left turn street was clear before proceeding into the intersection - and so should have Waymo, in which case -10 points.

What do we get more annoyed about? Waymo blocking traffic due to a dumb decision, or Waymo rescuing itself from a dumb decision? Neither is good - better not to make the dumb decision in the first place. But then we would get annoyed about it driving too slowly, being too slow about decisions, and holding up traffic. A certain amount of prediction is involved in efficient driving by humans; we don't always get it right, and if we want these machines to drive with us and around us, we need them to drive at least partially like a human.

A fine line, I guess, that will be constantly finessed.

But if a human had done what this Waymo did, I don't think it would have been newsworthy.
 
If Waymo is starting to switch over to ML/AI and replace rule-based code, and you get smoother rides as a result, doesn't that mean less predictability, since it won't follow strict rules as much? Sorta the way V12 does now.

So maybe some of Waymo's current problems are a conflict between the newer ML/AI code and the legacy rule-based code.
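Purely as an illustration of the kind of hybrid being described above (a hypothetical sketch - none of these names or checks come from Waymo's or Tesla's actual code), a learned planner whose proposals can be vetoed by legacy rule-based checks might look roughly like this:

Code:
# Hypothetical sketch of a hybrid planner: a learned model proposes a smooth
# trajectory and legacy rule-based checks can veto it in favor of a
# conservative fallback. Illustrative only, not any company's real design.
from dataclasses import dataclass

@dataclass
class Trajectory:
    waypoints: list             # [(x, y), ...] in meters
    speed_mps: float
    crosses_into_oncoming: bool

def ml_planner(scene: dict) -> Trajectory:
    # Stand-in for a learned model; here it just proposes going straight,
    # cutting into the oncoming lane if the path ahead is blocked.
    return Trajectory(waypoints=[(0, 0), (10, 0)], speed_mps=10.0,
                      crosses_into_oncoming=scene.get("blocked_ahead", False))

def passes_rule_checks(traj: Trajectory, scene: dict) -> bool:
    # Legacy hand-written constraints: hard "never do this" rules.
    if traj.crosses_into_oncoming and scene.get("oncoming_possible", True):
        return False
    if traj.speed_mps > scene.get("speed_limit_mps", 15.0):
        return False
    return True

def conservative_fallback(scene: dict) -> Trajectory:
    # Rule-based behavior: stop and wait rather than improvise.
    return Trajectory(waypoints=[(0, 0)], speed_mps=0.0,
                      crosses_into_oncoming=False)

def plan(scene: dict) -> Trajectory:
    proposal = ml_planner(scene)
    if passes_rule_checks(proposal, scene):
        return proposal                    # smoother, more "human" plan
    return conservative_fallback(scene)    # predictable, but less fluid

print(plan({"blocked_ahead": True}).speed_mps)   # 0.0  - the rule check wins
print(plan({"blocked_ahead": False}).speed_mps)  # 10.0 - the ML plan wins

The predictability question would live exactly at that boundary: the more weight the learned proposal gets (or the fewer hard vetoes remain), the smoother the ride but the less strictly rule-bound the behavior.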
 
The video doesn't give the whole story - we don't really know why the Waymo initially changed its mind - maybe it got stuck in the intersection because of traffic ahead of it, and decided to find a safe solution rather than blocking the intersection.
You can see in the video that the intersection had no traffic in the path of the Waymo.
[attached screenshot of the intersection]
 
  • Like
Reactions: Max Spaghetti
Right - that's the issue with "zero disengagement" drives on FSD too. If it does something obviously not right, but you don't disengage, is it really a "zero disengagement" drive?

Yeah, I've been arguing this for a bit.

I could have FSD hit a hive of Africanized bees and kill 10 puppies, but if I didn't disengage FSD, it doesn't matter for the stats. Same with Waymo and Cruise for all of these incidents.
 
It looks like Waymo is programmed to be aggressive, impatient, and wrong.

It wanted to turn left but was blocked in front by cars waiting in the median. Instead of waiting for those to clear out, it went ahead and turned left into the wrong-way lane, and could have had a head-on collision if there had been fast, inattentive drivers coming the correct way down that lane.

 
It looks like Waymo is programmed to be aggressive, impatient, and wrong.

It wanted to turn left but was blocked in front by cars waiting in the median. Instead of waiting for those to clear out, it went ahead and turned left into the wrong-way lane, and could have had a head-on collision if there had been fast, inattentive drivers coming the correct way down that lane.

Not sure if impatient is fair. They’re driving in LA, they could literally be waiting there for hours. I’m wondering how far they drove the wrong direction and whether it knew no one was coming. Technically you’re always driving against traffic when making a turn like that.
 
  • Disagree
Reactions: EVNow
Not sure if impatient is fair. They’re driving in LA, they could literally be waiting there for hours. I’m wondering how far they drove the wrong direction and whether it knew no one was coming. Technically you’re always driving against traffic when making a turn like that.
1. It's not LA, it's Tempe, AZ.
2. Then it should turn right and reroute to the destination.

This incident is not a good example for robots.
 
Yeah, I've been arguing this for a bit.

I could have FSD hit a hive of Africanized bees and kill 10 puppies, but if I didn't disengage FSD, it doesn't matter for the stats. Same with Waymo and Cruise for all of these incidents.
Without getting into such extremes - the basic point is whether the drive was "flawless", not just "zero disengagement". Like that example above of Waymo driving the wrong way, or FSD driving on the broad shoulder thinking it's a valid lane.
 
  • Like
Reactions: flutas
Waymo has to report when they hit dogs (or anything else).
Never said they didn't; I was just giving extreme examples of failures FSD could have that still wouldn't count as a disengagement.

Hence why I said "same with these incidents." I was saying that all of these incidents of unlawful driving (wrong way, illegal lane, illegal turn) don't have to be reported, and thus don't matter at all in the true "grand scheme of things," when they absolutely should.

It would be like a driver's license test passing you even if you broke all the laws, so long as you didn't hit anyone else.
 
  • Like
Reactions: Ben W
Never said they didn't; I was just giving extreme examples of failures FSD could have that still wouldn't count as a disengagement.

Hence why I said "same with these incidents." I was saying that all of these incidents of unlawful driving (wrong way, illegal lane, illegal turn) don't have to be reported, and thus don't matter at all in the true "grand scheme of things," when they absolutely should.

It would be like a driver's license test passing you even if you broke all the laws, so long as you didn't hit anyone else.
Disengagement reports are useful in evaluating these systems. If safety personnel were in the driver's seat, they would have rightly disengaged the system when it made those mistakes, so they would count as disengagements.

We are talking about L4 systems without safety personnel in the loop. They have to be able to recover from those mistakes if at all possible, so it wouldn't count as a disengagement, but I'm sure there are lots of mechanisms in place to monitor the performance of these systems and report mistakes and bad driving decisions.
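As a purely hypothetical sketch of that last point (event names and categories are made up - this is not Waymo's actual tooling), the idea is that internal monitoring can count far more than just disengagements:

Code:
# Hypothetical drive-event summary: a drive can be "zero disengagement" and
# "zero collision" while still logging rule violations and recoveries.
from collections import Counter

def summarize_drive(events: list) -> dict:
    counts = Counter(events)
    return {
        "disengagements": counts["disengagement"],
        "collisions": counts["collision"],
        "rule_violations": (counts["wrong_way"] + counts["illegal_turn"]
                            + counts["illegal_lane"]),
        "recoveries": counts["recovered_from_bad_plan"],
    }

drive = ["wrong_way", "recovered_from_bad_plan", "illegal_turn"]
print(summarize_drive(drive))
# {'disengagements': 0, 'collisions': 0, 'rule_violations': 2, 'recoveries': 1}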
 
Never said they didn't; I was just giving extreme examples of failures FSD could have that still wouldn't count as a disengagement.

Hence why I said "same with these incidents." I was saying that all of these incidents of unlawful driving (wrong way, illegal lane, illegal turn) don't have to be reported, and thus don't matter at all in the true "grand scheme of things," when they absolutely should.

It would be like a driver's license test passing you even if you broke all the laws, so long as you didn't hit anyone else.
I don't think they matter much. Being much safer than the average human driver is what matters to me.
I'm sure there are many AV designs that could pass driver's tests, but it doesn't mean much because those tests are designed for humans. The only known way to test AVs is by brute force.
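To put a rough number on "brute force" (back-of-envelope only; the human baseline is approximate and the statistics are simplified): with US drivers at very roughly one fatality per 100 million miles, the "rule of three" says you need about three times that many fatality-free miles before you can claim, at ~95% confidence, that the AV's rate is below the human baseline.

Code:
# Back-of-envelope: fatality-free miles needed to show an AV beats a rough
# human baseline, using the "rule of three" (95% upper bound ~ 3/n when
# zero events are observed in n trials). Baseline figure is approximate.
human_fatal_rate = 1 / 100_000_000       # ~1 fatality per 100 million miles

miles_needed = 3 / human_fatal_rate      # need 3/n < baseline  =>  n > 3/baseline
print(f"{miles_needed:,.0f} fatality-free miles")   # 300,000,000

Which is part of why these comparisons end up being statistical mileage accumulation rather than anything like a driving-test checklist.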
 
I don't think they matter much. Being much safer than the average human driver is what matters to me.
I'm sure there are many AV designs that could pass driver's tests, but it doesn't mean much because those tests are designed for humans. The only known way to test AVs is by brute force.
Well, driver's tests are designed to test the driver's understanding of road rules and laws, something I would think applies to AVs as well...

But tell me this: what are we actually testing here with AVs?

Currently it's only "can the car drive without a collision." Nothing else.

I'm sure Waymo et al have internal stats...but that's a practice test and it's never actually "graded by the teacher."

**WARNING THE STATEMENT BELOW CONTAINS A HYPERBOLIC SITUATION TO EXEMPLIFY THE ISSUE**

With the current "test," a Waymo could literally drive the wrong way down the freeway, causing 500 collisions among motorists trying to get out of its way, and still never have a single mark on the test saying it did anything wrong.