Waymo

And remote assistance gave the Waymo Driver bad information about how to handle the coned-off lane. It's likely that, with the right information, the Waymo Driver would have been fine. In any case, Waymo will learn from this edge case and make the Waymo Driver even better. Developing FSD is a learning process.

The problem with assessing Waymo is that we don't know when remote assistance is making a decision vs the software.

That's why I have trouble trusting many of Waymo's accomplishments, and why I think Tesla's FSD beta is so much more interesting to analyze.

For example, you previously showed a clip of a Waymo's path through double-parked cars in SF. How do we know that a remote driver didn't initiate that path? It's a guessing game, so the technological awe is diminished.

Edit: I want to thank JJRicks for publishing the video in all its glory. The funniest part is that Waymo paid a voice actor to read their statement regarding the disengagement, lol.

Edit2: JJRicks' video is fascinating. It's basically proof that sensors aren't the limitation. The fact that companies keep adding more and higher-resolution sensors is a marketing play.

Edit3: if Waymo is getting hung up on simple cone placements, good luck in SF.
 
There was so much potential for injury or property damage in this situation. It's really fortunate nobody was hurt.

First, how many times did the support line think the vehicle had come to a complete stop when it was actually still driving? We know the vehicle wouldn't move if the safety driver stood in front of the vehicle, but would it be capable of moving while the safety driver was approaching? Could the safety driver be injured if they think the vehicle is immobilized while it's actually active?

Second, what would have happened if actual construction work was occurring on the right side of the cones? It was lucky construction was completed and the workers were in the process of picking up the cones. Imagine if the right side of the cones had hot asphalt or wet concrete; fully level with the road so sensors wouldn't be able to detect a difference, but catastrophic if the vehicle were to drive into it. Why did the vehicle decide to travel through the construction cones instead of sticking to one side?

Third, I completely disagree with Waymo that stopping dead in the middle of the road was a safe response to the situation. I thought their vehicles were capable of an appropriate dynamic driving task fallback? With the vehicle stopped dead in the middle of the road, it's just a matter of time until an inattentive driver rear-ends the Waymo. Not to mention that after the second time the vehicle started driving again, it rapidly pulled out into moving traffic and almost hit another vehicle (at 23:46).
 
I'm not sure how to reconcile this:

with this:

Your bar for "amazing FSD" must be much lower than mine. (Unless you meant amazingly bad. :rolleyes:)

Well, it's relative. I think FSD that only has 1 disengagement per 30,000 miles is amazing. Nobody is close to that. So yeah, that's pretty amazing in my view. But that does not mean that there is not room for improvement. I acknowledge that Waymo's FSD still needs to improve in some areas. It is not perfect. Something can be amazing and still have room to be even better.
 
The problem with assessing Waymo is that we don't know when remote assistance is making a decision vs the software.

That's why I have trouble trusting many of Waymo's accomplishments, and why I think Tesla's FSD beta is so much more interesting to analyze.

For example, you previously showed a clip of a Waymo's path through double-parked cars in SF. How do we know that a remote driver didn't initiate that path? It's a guessing game, so the technological awe is diminished.

Edit: I want to thank JJRicks for publishing the video in all its glory. The funniest part is that Waymo paid a voice actor to read their statement regarding the disengagement, lol.

Edit2: JJRicks' video is fascinating. It's basically proof that sensors aren't the limitation. The fact that companies keep adding more and higher-resolution sensors is a marketing play.

Edit3: if Waymo is getting hung up on simple cone placements, good luck in SF.

I think we can. Waymo does not do tele-operation. Waymo only provides guidance if the car gets stuck, but the Waymo Driver still makes all the decisions. So yeah, there might be instances where the Waymo Driver needed some guidance, but we can still judge what we see because it is the car doing all the driving tasks.

In the case of the Waymo navigating through double-parked cars, there was no guidance from a remote driver. The remote driver did not initiate any path. But even if they had, we can still judge how the car executed the path because the car was doing all the driving. There was no tele-operation.
 
I acknowledge that Waymo's FSD still needs to improve in some areas. It is not perfect. Something can be amazing and still have room to be even better.

The car literally ran away just as the roadside assistance guy was about to walk up to it?

I don't understand why they can't stop the car. Seems dangerous and obvious.

Edit: if this event gets more widely publicized, I'm afraid Waymo might have to shut down their driverless operations for a while to fix these obvious oversights.

Edit2: if this were a Tesla, you can bet there will be a gang of shorties calling the Chandler city hall right now
 
The car literally ran away just as the roadside assistance guy was about to walk up to it?

I don't understand why they can't stop the car. Seems dangerous and obvious.

Oh I agree, the remote assistance sucks. I think the whole idea of having roadside assistance nearby in vans, ready to catch up to the Waymo if there is a problem, is a bad system.
 
Bottom line is that remote assistance screwed up. And yes, it was potentially a dangerous situation. Fortunately, there was no accident. I am sure Waymo will learn from this situation and both improve their remote assistance protocols and improve the Waymo Driver so that remote assistance is not needed.
 
The funniest part is that Waymo paid a voice actor to read their statement regarding the disengagement, lol.
Ah, small detail, but I actually hired the voiceover guy since I'd been looking for an excuse to, and I like his voice. Danny Harmon, been following his YouTube for the better part of a decade. Lovely guy. But yeah, anyway, definitely a weird video; lots to analyze.
 
Ultimately, it would be good for regulators to mandate disaggregated disengagement statistics. One disengagement every 30,000 miles sounds great in the abstract, but what if that means 1 disengagement every 55,000 miles on clear roadway plus 1 disengagement every 5,000 miles in construction zones?

It's kinda the same problem with Tesla releasing their Autopilot safety statistics in the aggregate. It looks like Autopilot is safer than the average car, but we know that Autopilot is more likely to be engaged on the highway than average.
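
To make the arithmetic concrete, here's a minimal sketch of how a blended rate hides that split (the 90/10 mileage mix is my own assumption; the per-condition figures are just the hypothetical ones above):

```python
# Sketch: how an aggregate disengagement rate can mask per-condition rates.
# Assumption: 90% of fleet miles are on clear roadway, 10% in construction.

clear_mpd = 55_000         # hypothetical miles per disengagement, clear roadway
construction_mpd = 5_000   # hypothetical miles per disengagement, construction
clear_share = 0.90         # assumed share of total miles on clear roadway

# Disengagements per mile, weighted by where the miles are actually driven
blended = clear_share / clear_mpd + (1 - clear_share) / construction_mpd
print(f"Aggregate: 1 disengagement per {1 / blended:,.0f} miles")
# -> 1 per 27,500 miles, close to a "1 per 30,000" headline number,
#    even though construction zones are 11x worse than clear roadway.
```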
 
Edit2: JJRicks' video is fascinating. It's basically proof that sensors aren't the limitation. The fact that companies keep adding more and higher-resolution sensors is a marketing play.

Adding higher resolution sensors is not a marketing play. It clearly gives the car better perception. Do you really think that lower resolution is better? And we still don't know if Tesla's camera sensors are good enough for L4 or L5. But perception is not everything. Even with excellent perception, the car still needs good planning. The issue we see in the video has nothing to do with bad perception. The Waymo detected all the cones perfectly. The problem was the planner, which was not sure which lane was open.

Actually, lots of areas. In fact, it doesn't work at all in most of the world.

Waymo is only in the US.
 
Adding higher resolution sensors is not a marketing play. It clearly gives the car better perception.
Clearly you have no idea WTF you're talking about.
Perception: the ability to see, hear, or become aware of something through the senses.

A higher resolution sensor doesn't mean squat if you do not know what to do with standard resolution data! :rolleyes: 🤦‍♂️
 
Ah, small detail, but I actually hired the voiceover guy since I'd been looking for an excuse to, and I like his voice. Danny Harmon, been following his YouTube for the better part of a decade. Lovely guy. But yeah, anyway, definitely a weird video; lots to analyze.

Wow, that's even weirder than Waymo paying for the guy! Lol

I love you man. It was fun to see you so excited to get the car stuck. You're doing great work.
 
Ah, small detail, but I actually hired the voiceover guy since I'd been looking for an excuse to, and I like his voice. Danny Harmon, been following his YouTube for the better part of a decade. Lovely guy. But yeah, anyway, definitely a weird video; lots to analyze.

Thanks for your video.

Would you mind sharing how many disengagements you've experienced? I think in your video you mentioned 2-3 prior to this one?

And do you have an estimate for how many miles you've travelled?

If this is your 4th disengagement and you haven't travelled 120,000 miles with Waymo (four disengagements at a rate of 1 per 30,000 miles would imply that much driving), I think it's fair to say that you pick more challenging routes than the average rider!
 
A higher resolution sensor doesn't mean squat if you do not know what to do with standard resolution data! :rolleyes: 🤦‍♂️

True. That would explain why Tesla does not use higher resolution sensors.

In the case of Waymo, they know what to do with standard resolution data, so they can also take advantage of higher resolution, and the cars can see even better.
 
@willow_hiller All of that and more is extensively documented on the Video Archive. (Last ~4 rides aren't in the spreadsheet yet due to forgetting/laziness, will fix soon.) Out of all the disengagements I've seen, not one has been the direct result of a safety threat. (Maybe this last one is edging closer to that line, though.) All have been weird logic quirks or the safety driver being a bit too proactive. (In my opinion, at least.) Most are documented on video. (Oops, apologies, misplaced my @ for a sec there.)
 
@willow_hiller All of that and more is extensively documented on the Video Archive. (Last ~4 rides aren't in the spreadsheet yet due to forgetting/laziness, will fix soon.) Out of all the disengagements I've seen, not one has been the direct result of a safety threat. (Maybe this last one is edging closer to that line, though.) All have been weird logic quirks or the safety driver being a bit too proactive. (In my opinion, at least.) Most are documented on video. (Oops, apologies, misplaced my @ for a sec there.)

Cool website, those are some really detailed records you've been keeping.

It is interesting that you've personally experienced 4 disengagements over about 1,100 miles of rides (roughly one every 275 miles).
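
To put that in perspective, here's a rough back-of-envelope check. It's only a sketch: it assumes disengagements arrive independently (Poisson), and it uses the 1-per-30,000-mile figure floated earlier in the thread purely as a benchmark:

```python
import math

fleet_mpd = 30_000   # benchmark miles per disengagement (from earlier in the thread)
miles = 1_100        # approximate miles ridden
observed = 4         # disengagements experienced

lam = miles / fleet_mpd  # expected disengagements at the benchmark rate (~0.037)

# Poisson probability of seeing `observed` or more disengagements
p = 1 - sum(math.exp(-lam) * lam**k / math.factorial(k) for k in range(observed))
print(f"Expected {lam:.3f}, P(>= {observed}) = {p:.1e}")
# -> roughly 7e-8: either these routes are much harder than average,
#    or the true rate on them is nowhere near 1 per 30,000 miles.
```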

Would you say you tend to pick particularly difficult routes? Or is it varied?