Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Waymo

It is not too hard to calculate future positions of vehicles when you know their exact speed and trajectory. The Waymo stack does this every millisecond with high accuracy.
Every millisecond? Are you sure? I think their lidar is 20 Hz, i.e. 50 ms. Video is usually 24-60 fps, i.e. 16-41 ms. I don't know about radar, but overall I doubt they recalculate more often than once every 20-30 ms. Note that in 30 ms a vehicle covers about one meter on the highway, but only about a foot in city traffic.
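If anyone wants to sanity-check those numbers, here's a quick back-of-envelope sketch. The speeds are my own assumptions (roughly 65 mph highway, 25 mph city), not anything from Waymo:

```python
# Distance a vehicle covers during one perception update interval.
# Speeds below are rough assumptions, not Waymo specs.
HIGHWAY_SPEED_MPS = 29.0  # ~65 mph
CITY_SPEED_MPS = 11.0     # ~25 mph

def distance_per_update(speed_mps: float, interval_ms: float) -> float:
    """Meters traveled during one update interval."""
    return speed_mps * interval_ms / 1000.0

for label, speed in [("highway", HIGHWAY_SPEED_MPS), ("city", CITY_SPEED_MPS)]:
    for interval_ms in (20.0, 30.0, 50.0):
        d = distance_per_update(speed, interval_ms)
        print(f"{label}: {interval_ms:.0f} ms -> {d:.2f} m")
```

At 30 ms that works out to about 0.87 m on the highway and 0.33 m (about a foot) in city traffic, which matches the estimates above.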

In the video, we see the Waymo start to creep forward and then stop. I think the Waymo initially calculated the future positions of the vehicles accurately and determined it was safe to make the turn. So the planner told the car to proceed, but the Waymo moved too slowly. This was a case where, if the Waymo had "gunned it" immediately, I think it could have made the turn safely, but it hesitated a fraction of a second and that was enough for the turn not to be safe anymore. The Waymo recalculated the future positions of the vehicles accurately and determined that it was no longer safe, so the planner told the car to stop.
Better than any theory I have, but that'd be a real screwup. Waymo should predict ego speed and location with near-perfect accuracy vs. oncoming vehicle predictions which necessarily come with wide error bands.

Waymo told the Chronicle in a statement that the robotaxi “detected that there may be a risk of a person within that crowd who had fallen down, and decided to carefully initiate a passing maneuver when the opposing lane was clear to move around what could be an obstacle and a safety concern.”
Interesting. Seems the best response to a possible fallen rider would be to slow and lengthen the following distance. Unis don't stop that well so I'd expect the others to swerve around their fallen comrade. Of course I doubt they have much pack-of-unicycle training data, so a little confusion is understandable. Even so, I'd still expect confusion to result in slowing or stopping vs. 40 seconds of blatantly illegal driving into oncoming traffic.
Saw a little hint that I should look through Waymo's accident reports so far this year.

Many are the same old story of them getting rear-ended, but a few stood out.

Waymo drives into a gate.
On February 5, 2024 at 10:01 PM PT a Waymo Autonomous Vehicle (“Waymo AV”) operating in Los Angeles, California was in a collision involving a gate at Watt Way and West Exposition Boulevard.

The Waymo AV was traveling south on Watt when it approached an automatic gate as it had started to close. The right side of the Waymo AV made contact with the door of the gate. At the time of the impact, the Waymo AV’s Level 4 ADS was engaged in autonomous mode. The Waymo AV sustained damage.
https://www.dmv.ca.gov/portal/file/waymo_02052024-pdf/

Waymo hits a parked car
On March 28, 2024 at 7:30 AM PT a Waymo Autonomous Vehicle (“Waymo AV”) operating in Los Angeles, California was in a collision involving a passenger car on Louisiana Avenue at Kerwood Avenue.

While making a tight left turn from southbound Kerwood Avenue onto northeastbound Louisiana Avenue, the Waymo AV made contact with the driver side of a parked passenger car. At the time of the impact, the Waymo AV’s Level 4 ADS was engaged in autonomous mode, and a test driver was present (in the driver’s seating position). Both vehicles sustained damage.
https://www.dmv.ca.gov/portal/file/waymo_032824-pdf/


The big thing is, what I was looking for isn't there.
Waymo caused an accident with a VRU by running a red light and didn't report it.


In January, an incident took place where a Waymo robotaxi incorrectly went through a red light due to an incorrect command from a remote operator, as reported by Waymo. A moped started coming through the intersection. The moped driver, presumably reacting to the Waymo, lost control, fell and slid, but did not hit the Waymo and there are no reports of injuries. There may have been minor damage to the moped.

Waymo Runs A Red Light And The Difference Between Humans And Robots

Relevant law:
§ 227.48. Reporting Collisions.

A manufacturer whose autonomous vehicle while operating under a Manufacturer's Testing Permit or a Manufacturer's Testing Permit - Driverless Vehicles is in any manner involved in a collision originating from the operation of the autonomous vehicle on a public road that resulted in the damage of property or in bodily injury or death shall report the collision to the department, within 10 days after the collision, on Report of Traffic Collision Involving an Autonomous Vehicle, form OL 316 (Rev. 7/2020) which is hereby incorporated by reference. The manufacturer shall identify on the form, by name and current address, if available, all persons involved in the collision, and a full description of how the collision occurred. Nothing in this section relieves any person from compliance with any other statutory and/or regulatory collision reporting requirements.
Looks like the first Waymo Zeekr robotaxi has been spotted in SF.


Some characteristics of Waymo's Zeekr vehicle. There is a typo in the text: it should be 500,000 km lifetime.



Waymo is great at ULTs (unprotected left turns). One bad example does not disprove that. Waymo probably does thousands of ULTs a week with no issue. I could easily show 10 examples of Waymo doing a great ULT.
There are hundreds of examples of FSD taking turns without curbing.

That is the problem with autonomous cars - you have to be able to drive 100,000 miles before a serious accident.

So it's not about the times you get it right, but about the times you get it wrong.
 
I've seen more examples of Waymo bad driving recently than ever before. Part of this is because they're driving more miles, but I also think they dialed up the aggressiveness. Maybe too much.
Yes ... like this one I posted earlier. One Waymo cut off the other one.

Guess which Waymo here is not in Grandma mode? What would @Bladerskb say if this was a Tesla?

That is the problem with autonomous cars - you have to be able to drive 100,000 miles before a serious accident.

So it's not about the times you get it right, but about the times you get it wrong.

Exactly. So how often does Waymo get it wrong? How often does Waymo fail a ULT? You have given me one example of a fail. That does not prove that Waymo is not great at ULTs. Maybe Waymo only fails a ULT every 1M ULTs? Do you have evidence that Waymo fails ULTs a lot?

We know from Waymo's safety report that Waymo has 0.41 accidents per 1M miles. So Waymo drives far more than 100k miles per serious accident.

Source: https://assets.ctfassets.net/e6t5di...man_Benchmarks_at_7_1_Million_Miles_arxiv.pdf
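The arithmetic behind "way more than 100k miles" is just the reciprocal of the rate. Taking the 0.41 figure from the report at face value:

```python
# Convert an accident rate (per million miles) into miles per accident.
accidents_per_million_miles = 0.41  # figure from Waymo's safety paper

miles_per_accident = 1_000_000 / accidents_per_million_miles
print(f"{miles_per_accident:,.0f} miles per accident")  # ~2.4 million
```

That's roughly 2.4 million miles per accident at that rate, about 24x the 100k-mile bar mentioned above.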

Yes ... like this one I posted earlier. One Waymo cut the other one.

That was 4 months ago.
 