
Waymo

I see three possibilities here:

1. Waymo has gone NN-crazy with few or no guardrails and this was simply an NN hallucination
The least likely, IMHO. Waymo/Google are NN leaders and have long done E2E work, but they understand the limitations and have even pointed some out publicly. I don't see them adopting such a flaky approach in deployed cars.

2. Waymo asked Fleet Response "can I proceed through this object?" and FR mistakenly said yes
Also unlikely. As @Daniel in SD points out, FR may be able to tell the car to proceed over a trash bag or through a low-hanging branch, but ramming a giant pole could only happen with a terrible system design AND a grossly incompetent (or malevolent) remote monitor.

3. A bug in their heuristic guardrail code caused it to ignore a clearly detected object
Such a bug could be triggered by FR, just as a video game player might trigger a "wall" bug by running parallel to the wall and turning suddenly or jumping just before hitting the wall or whatever. We'll never know unless Waymo tells us. Which they should. I don't expect driving perfection, but I do expect "don't run straight into huge poles" perfection. This is a "Day 1" issue, a Ten Commandments violation. Such a basic failure goes straight to their core and shakes my confidence in their overall system design.
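To make the "wall bug" analogy concrete, here is a minimal, entirely hypothetical sketch (in Python, with invented names and numbers - not anything from Waymo's actual stack) of how a heuristic guardrail can "see" an object and still wave the car through:

```python
# Purely illustrative sketch of the kind of guardrail blind spot I mean.
# Nothing here is Waymo's actual code; every name and number is invented.

from dataclasses import dataclass

@dataclass
class Obstacle:
    lateral_offset_m: float  # signed distance from the planned path's centerline
    along_path_m: float      # distance ahead along the planned path
    is_static: bool

VEHICLE_HALF_WIDTH_M = 1.1   # hypothetical half-width of the vehicle
LOOKAHEAD_M = 30.0           # hypothetical guardrail check horizon

def guardrail_blocks(obstacle: Obstacle) -> bool:
    """Veto the planned path if a detected static obstacle overlaps it."""
    if not obstacle.is_static:
        return False  # dynamic agents handled elsewhere in this sketch
    if obstacle.along_path_m > LOOKAHEAD_M:
        return False  # beyond the check horizon
    # BUG: the strict inequality waves through an obstacle sitting exactly
    # on the corridor edge -- "clearly detected" by perception, silently
    # ignored by the guardrail.
    return abs(obstacle.lateral_offset_m) < VEHICLE_HALF_WIDTH_M

# A pole exactly on the corridor edge is detected but never blocked:
pole = Obstacle(lateral_offset_m=1.1, along_path_m=12.0, is_static=True)
print(guardrail_blocks(pole))  # False -- the path is not vetoed
```

The point isn't this particular off-by-epsilon bug; it's that perception can clearly detect an object while a downstream heuristic quietly declines to act on it, and only an unusual approach geometry (or an FR nudge) ever exercises that code path.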

This is why Waymo needs to be more transparent and explain these incidents. Without an explanation from Waymo, the public will inevitably speculate, which will lead to false theories, rumors, etc., and public trust will be shaken as people wonder whether the tech is really reliable.
 
Great example. But some early video games had bugs which in some cases let you partly go through a wall or see through a wall or whatever.
Hence why I said (and you quoted) "typically."

Not sure if you read everything I said, but your scenario 3 is quite literally what I said.
Ask yourself (especially with the context of everything we've seen from Waymo as of late) what's more likely: that the car itself can run a red light in an obscure situation because of a bug, or that Waymo is risking its reputation, the company, and potential jail time for execs by lying to the public (and to investors, which could get them for wire fraud)?

Wonder how long before we see a new model or manufacturer, since those cars will be subject to a 102% tax shortly.
 
Waymo ALERT in ATL. Odd that when they first started mapping/testing/joyriding or whatever, I was seeing them several times a week. Then about 3 weeks ago they disappeared, so I thought they may have gone back to Cali. Well, they are either back or never left.

 

US safety probe into Waymo self-driving vehicles finds more incidents

In a letter to Waymo released Friday, NHTSA said it has learned of 9 additional similar incidents.

The agency said several incidents under investigation "involved collisions with clearly visible objects that a competent driver would be expected to avoid."

NHTSA said "reports include collisions with stationary and semi-stationary objects such as gates and chains, collisions
with parked vehicles, and instances in which the (automated driving system) appeared to disobey traffic safety control devices or rules."

(...) Talking about Waymo not refuting the claims, instead boasting of safety (...)

NHTSA said it is concerned that Waymo self-driving vehicles "exhibiting such unexpected driving behaviors may increase the risk of crash, property damage, and injury" and added that a number of incidents occurred "in the proximity of other road users, including pedestrians."

Link to the letter: https://static.nhtsa.gov/odi/inv/2024/INIM-PE24016-13047.pdf

Interestingly it sourced:

  • Reddit for the car driving on the wrong side of the road in SF.
  • Twitter for the car driving the wrong way after the failed left turn.
  • Twitter for the car aborting the mid-intersection turn.
  • Twitter (with an X URL lol) for the car driving into the construction zone.
  • YouTube for the car looping in a parking lot.
  • YouTube for the car driving in the bus-only MUNI lane (and the illegal left turn).
  • Reddit for the cars blocking the freeway on-ramp.
  • Instagram for the car swerving while following the tree trailer.
  • Reddit for the car cutting someone off (no proof in this one, though they provided time/date/location).
  • YouTube for the car pulling out into a travel lane, blocking it, and almost hitting the bus trying to go around it. (I hadn't seen this one before.)
  • YouTube for the recent telephone pole collision.
and demanded Waymo provide them:

  • VIN
  • Complete system config (including HW and SW versions)
  • "a brief explanation of each Waymo ADS’s decision-making that led to a collision or a potential traffic safety law violation" (this will be interesting if it sees the light of day)
  • ALL video from 30 seconds before the incident to the conclusion of the incident (first line crashes only [a], second line all incidents [b])
  • Whether remote assistance was involved in any way (including just monitoring), and all video of it
  • "a composite rendering for the same timeframe specified in requests 2a or 2b (as applicable) showing video of each crash partner and each involved traffic control device alongside synchronized renderings of the ADS planned paths, the predicted trajectories of relevant road users, the velocity of the Waymo vehicle and other relevant road users, and the acceleration of the Waymo vehicle."
  • (If they don't have visibility into the AV's decision-making process mentioned above) "supplement the video submission with additional composite views showing additional perception, planning, or other elements that influenced the ADS decision-making"
 

It is completely fair for NHTSA to investigate these incidents, especially since some of them do raise safety concerns. NHTSA is just doing their job. I welcome the accountability.

But I would point out that the incidents span several months. In fact, some of the other incidents that NHTSA said they were investigating are from a couple of years ago. So some of these incidents are from older software versions, and the issues may have been solved by now. It would be like NHTSA going back to investigate incidents on FSD V10 when the issues have been solved on V12. If that's the case, Waymo should be able to show that the issues have been solved in the most up-to-date software. I hope that is factored into NHTSA's investigation.

Having said that, I look forward to seeing NHTSA's conclusions. Hopefully, we learn some good stuff and it helps make Waymo's autonomous driving even safer.
 
This may yet be another example of how regulatory bodies move too slowly for technology. EVs, and even some ICE cars, now update software OTA for enhanced features, fixes for capability issues, etc. NHTSA and other agencies were built around the old ICE model of slow development and slow rollout of new models. It was expected that Toyota would release a new Corolla once a year, with a refresh every 3 years and a "redesign" every 5 or so. NHTSA had months to investigate before Toyota released a new model, which possibly fixed the issue, and to come up with a plan to fix the older models. With modern EVs, and some ICEs, the issue has been fixed before the investigation is even finished.

I've laughed a few times when getting a recall letter for my MY21 car about an issue that had been fixed months previously.
 
Browsing the CA DMV reports, Waymo hit a raccoon on the freeway. Of course the driver disengaged "shortly" before impact, thus it's not the Waymo Driver at fault.

Wonder how long "shortly" is; it feels like something that should be stated in specific terms (5s, 30s) rather than in vague terminology.

On May 8, 2024 at 11:16 PM PT a Waymo Autonomous Vehicle (“Waymo AV”) operating in San Francisco, California was in a collision involving a wild animal (raccoon) on Interstate 280 Junipero Serra Freeway between exits 49B and 51.

The Waymo AV was traveling eastbound on Interstate 280 in autonomous mode when a raccoon ran into the roadway from the left side of the highway in front of the Waymo AV. The Waymo AV test driver transitioned to manual mode shortly before the front driver’s side of the Waymo AV made contact with the raccoon. At the time of the impact, the Waymo AV’s Level 4 ADS was not engaged and a test driver was operating the Waymo AV in manual mode. The Waymo AV sustained damage.
 
So crossing guards reporting that Waymos are nearly hitting them is probably hyperbole; it's more that the Waymos don't respond the same way human drivers do and seem more "unpredictable". Plus these guards tend to be older and less tech-savvy, many likely have a natural fear bias toward high tech, and seeing a car with NO driver is sorta shocking and scary to them.
I suspect that is partly true, but also a big part of it is the lack of eye contact.

I think when most pedestrians venture out onto a road expecting vehicles to give way to them, they establish eye contact with the driver which gives them some confidence the driver is aware of their presence and will not hit them. If they notice the driver looking the other way or texting, they will assume the driver is not going to give way and take evasive action themselves.

That is not possible with a Waymo, so the pedestrian has no way of confirming that the Waymo can see them and is giving way. So yes, from their perspective, "the Waymo nearly hit them" is just as true as "the Waymo didn't hit them because it saw them and stopped short of them, but they didn't know that until it actually stopped, so they got scared that it might hit them" - it's just a matter of confidence.

A solution could be better communication from the car via some form of audio/visual system, such as additional lights or sounds. For example, a traditional car has brake lights at the rear to let following traffic know it is slowing/stopping. But it doesn't have brake lights at the front to let a pedestrian know it is slowing/stopping (or isn't!). Maybe some form of visual braking indication on the front of the car would help - preferably standardised so all AVs implement the same system.
 
Waymo already has a "stopping for pedestrian" indicator; it shows the following image (also on the front):
[Image: Waymo_DomeComms_YieldingToPed_Back.png]


The bus situation (where it cut off the bus) is tougher. In that case, lack of eye contact probably played a role. The bus driver had no way to tell the intention of the AV, so he wasn't sure whether it was going or not, and thus hesitated.
 
Browsing the CA DMV reports, Waymo hit a raccoon on the freeway. Of course the driver disengaged "shortly" before impact, thus it's not the Waymo Driver at fault.

Wonder how long "shortly" is; it feels like something that should be stated in specific terms (5s, 30s) rather than in vague terminology.
Waymo is liable when their test drivers crash too (same as any other company employing drivers).
I think you can assume the Waymo Driver would have had a collision whenever they say “shortly”. The paper where they looked at all the collisions they had in Phoenix didn’t describe any cases where the safety driver disengaged and had a collision the system would have avoided. There were many cases where the safety driver disengaged and avoided a collision the Waymo Driver would have had.
 
Another day, another "what the ****" from Waymo is dug up.

This time failing a left turn on a clear street (very similar behavior to the Phoenix incident NHTSA is investigating), then ramming itself through pedestrians already in the crosswalk.


and since the footage is terrible, here's someone who pulled screenshots from it.

And another one from everyone's favorite shill.

This time with a Waymo ignoring a keep-clear zone and blocking a bus until (presumably after remote ops saves it) the car performs the weirdest maneuver I have ever seen to get out of the way.


Looks like the corner of 16th and Church in SF. The stance of the car at the start makes me wonder if the lot nearby is a staging lot, and the cars pulling out of it don't have enough room to do so without encroaching on the staggered keep-clear zone that exists for this exact situation.

 
Are you saying Alphabet should make the decision and just tell Waymo to do it?
Google is one crazy a$$ company when it comes to decisions on its products and divisions. Licensing could limit Waymo's (the division's) growth but be more profitable for Google. If Waymo were spun off with access to enough capital, they would probably like to expand to more markets and grow the company rather than license to other RT competition. Likely they would only be interested in licensing to other manufacturers for consumer-only cars that exclude RTs. But just a bunch of hypothetical babble on my part. 🤣
 
Waymo LLC has owners other than Alphabet (about 7.5%), and Waymo fully owns the IP to the Waymo Driver, so Waymo LLC is the only company that can sell or grant a license to use the Waymo Driver.