Waymo

Well, driver's tests are designed to test the driver's understanding of road rules and laws, something I would think applies to AVs as well...

But tell me this, what are we actually testing here with AVs?

Currently it's only "can the car drive without a collision." Nothing else.

I'm sure Waymo et al. have internal stats... but that's a practice test, and it's never actually "graded by the teacher."

**WARNING THE STATEMENT BELOW CONTAINS A HYPERBOLIC SITUATION TO EXEMPLIFY THE ISSUE**

With the current "test" a Waymo could literally drive the wrong way down the freeway, causing 500 collisions among motorists trying to get out of its way, and still never have a single mark on the test saying it did anything wrong.
A driver's test is only 20 minutes. I suspect the vast majority of people would fail if the driver's test were the worst 20 minutes of their driving history.
Obviously we should include causing third party collisions when measuring safety. I would hope that Waymo is required to save data so that it can be investigated if it’s suspected of causing collisions.
What should the AV standard be for law-breaking?
It seems to me that actual collisions are a much better indicator of safety since the sample size is so much larger than you would have for an individual human driver.
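
To put rough numbers on the sample-size point, here's a back-of-the-envelope sketch (assuming collisions are roughly Poisson in miles driven; the counts and mileages are made up purely for illustration):

```python
import math

def collision_rate_ci(events: int, miles: float, z: float = 1.96):
    """Approximate 95% CI for a collision rate per million miles,
    treating collisions as a Poisson process over miles driven."""
    per_million = events / miles * 1e6
    # Normal approximation to the Poisson interval on the count;
    # crude for small counts, but good enough to show the scale.
    half_width = z * math.sqrt(max(events, 1)) / miles * 1e6
    return per_million, max(per_million - half_width, 0.0), per_million + half_width

# Hypothetical numbers: a fleet with 20M driverless miles vs. one
# human driver with 200k lifetime miles and a single collision.
for label, events, miles in [("AV fleet", 40, 20e6), ("one human", 1, 2e5)]:
    rate, lo, hi = collision_rate_ci(events, miles)
    print(f"{label}: {rate:.1f}/1M mi (95% CI ~{lo:.1f} to {hi:.1f})")
```

With made-up numbers like those, the fleet's interval comes out tight while the individual driver's spans everything from "cautious" to "menace", which is exactly the sample-size point.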
 
Which will be kept extremely secret by Waymo and never published ;)
You're in this very thread talking about it; there is no secret. These systems are being tested on public roads, so none of the companies testing L4 systems on public roads can hide how bad their systems are. Uber couldn't hide it, Cruise couldn't hide it, and Waymo, who has been doing this far longer than anyone, certainly can't hide it.

If you want to hide how badly your system drives, claim it is L2 so you don't need to take the blame or report how bad your system is. That has changed now, as all L2 systems still have to report their data.
 
> You're in this very thread talking about it; there is no secret. These systems are being tested on public roads, so none of the companies testing L4 systems on public roads can hide how bad their systems are. (...)
He's not talking about anecdotal incidents reported by the media or public (which are likely a very small subset of the total), but rather the data reports (if they exist within Waymo) on incidents where the car broke the law. Under current regulations, Waymo is not required to report those; they only have to report disengagements. Heck, under current California law, Waymo can't even be ticketed for them, while a driver can.
 
> He's not talking about anecdotal incidents reported by the media or public, but rather the data reports on incidents where the car broke the law. (...)
I'm fully aware of what he's talking about, and I'm saying you cannot hide how badly your system performs on public roads. It is literally impossible, and that has been proven time and time again. A disengagement just means the system was disengaged; it could happen for any number of reasons.

We've had people on this very forum who used Waymo for years and meticulously documented every trip, and driving in the oncoming lane was not a common occurrence, so I'm inclined to speculate it's a regression of some kind due to recent changes in their system.
 
> With the current "test" a Waymo could literally drive the wrong way down the freeway, causing 500 collisions among motorists trying to get out of its way, and still never have a single mark on the test saying it did anything wrong.
I don't think so. In January a Waymo ran a red light due to bad instructions from a remote assistant. The Waymo then detected a crossing moped and stopped, but the moped fell over anyway, presumably attempting an evasive maneuver. The two vehicles never came into contact and the moped left the scene, but Waymo still reported it.
 
> I don't think so. In January a Waymo ran a red light due to bad instructions from a remote assistant. (...)
Where?

I actually asked this back in this thread, as it's not reported in the CA vehicle collision dataset.

Saw a little hint to look through Waymo's accident reports so far this year.

(...)

The big thing is, what I was looking for isn't there. Waymo caused an accident with a VRU (vulnerable road user) by running a red light and didn't report it.

Waymo Runs A Red Light And The Difference Between Humans And Robots

EDIT:

After digging more on this tonight, I managed to find the report in the NHTSA dataset. The accident happened in San Francisco and wasn't reported to the CA DMV; the NHTSA entry likewise doesn't state that the accident was reported to the CA DMV as required by law.

> This report includes the following updates to the initial report submitted on January 14, 2024 [REDACTED, MAY CONTAIN CONFIDENTIAL BUSINESS INFORMATION]: corrected event year in the narrative from 2023 to 2024; LOCATION ADDRESS / DESCRIPTION field updated to reflect the location listed in the narrative; clarifications added to the narrative and correction to the Waymo AV's mileage. The remainder is unchanged from the initial report.
>
> On January [XXX], 2024 at 10:52 AM PT a rider of a moped lost control of the moped they were operating and fell and slid in front of a Waymo Autonomous Vehicle (Waymo AV) operating in San Francisco, California on [XXX] at [XXX]. Neither the moped nor its driver made contact with the Waymo AV.
>
> The Waymo AV was stopped on northbound [XXX] at the intersection with [XXX] when it started to proceed forward while facing a red traffic light. As the Waymo AV entered the intersection, it detected a moped traveling on eastbound [XXX] and braked. As the Waymo AV braked to a stop, the rider of the moped braked then fell on the wet roadway before sliding to a stop in front of the stationary Waymo AV. There was no contact between the moped or its rider and the Waymo AV. The Waymo AV's Level 4 ADS was engaged in autonomous mode.
>
> Waymo is reporting this crash under Request No. 1 of Standing General Order 2021-01 because a passenger of the Waymo AV reported that the moped may have been damaged. Waymo may supplement or correct its reporting with additional information as it may become available.

Waymo has 2 accident reports listed for January 2024: January 8th and January 31st. Neither involves a moped.
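
For anyone who wants to repeat the search, this is roughly how I sliced NHTSA's SGO incident CSV with pandas. The filename and column names here are from memory and may not match the current download exactly, so check the headers in your copy:

```python
import pandas as pd

# NHTSA publishes Standing General Order 2021-01 incident reports as a
# CSV (the ADS file). Filename and column names are assumptions --
# verify them against the file you actually download.
df = pd.read_csv("SGO-2021-01_Incident_Reports_ADS.csv", dtype=str)

waymo_sf_jan = df[
    df["Reporting Entity"].str.contains("Waymo", case=False, na=False)
    & df["City"].str.contains("San Francisco", case=False, na=False)
    & df["Incident Date"].str.contains("JAN-2024", case=False, na=False)
]
print(waymo_sf_jan[["Report ID", "Incident Date", "Narrative"]])
```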
 
> Where? I actually asked this back in this thread, as it's not reported in the CA vehicle collision dataset. (...) I managed to find the report in the NHTSA dataset. The accident happened in San Francisco and wasn't reported to the CA DMV. (...) Waymo has 2 accident reports listed for January 2024: January 8th and January 31st. Neither involves a moped.
Yep,"in any manner", they definitely should have reported it according to the law.
[attached screenshot of the collision-reporting regulation]


Looks like there is a proposal to make traffic violations reportable:

> Commencing July 31, 2025, this bill would require a manufacturer of autonomous vehicles to report to the DMV a vehicle collision, traffic violation, or disengagement, as defined, or a barrier to access or incident of discrimination for a passenger with a disability, that involves a manufacturer's vehicle in California regardless of whether the vehicle is in the testing or deployment phase. The bill would require these reports to contain specified information and to be submitted at the time the incident is identified by the manufacturer. The bill would require these reports to be submitted on a timeline adopted by the DMV that does not exceed reporting deadlines required by the federal National Highway Traffic Safety Administration.
>
> The bill would additionally require a manufacturer to submit quarterly reports to the department that summarize the above-mentioned reports, vehicle miles traveled, unplanned stops, and wheelchair-accessible services, as specified. The bill would require the DMV, in consultation with the Department of the California Highway Patrol, the Public Utilities Commission, and any other public entity it deems necessary, to create and publish an autonomous vehicle incident form and a form to report data required by these provisions by no later than July 1, 2025. The bill would require the DMV to publish all reports submitted pursuant to these provisions in an electronic, open, and machine-readable format on the department's internet website within 30 days of receipt, as specified.
>
> The bill would authorize the DMV to impose specified fines for violations of the reporting provisions and to suspend or revoke the testing and deployment permit of any manufacturer while an investigation of any violations is pending. The bill would also authorize members of the public or public entities with direct evidence of an incident to submit an autonomous vehicle incident report, as specified.

Frankly, it seems just as likely that FSD is causing a bunch of unreported collisions.
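
If the bill passes as digested above, a per-incident report would presumably cover something like the fields below. This is a hypothetical data model I'm sketching from the summary, not anything the DMV has published; all names are my own invention:

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum

class IncidentType(Enum):
    # The four reportable categories named in the bill digest.
    COLLISION = "collision"
    TRAFFIC_VIOLATION = "traffic_violation"
    DISENGAGEMENT = "disengagement"
    DISABILITY_ACCESS_INCIDENT = "disability_access_incident"

@dataclass
class AVIncidentReport:
    """Hypothetical shape of a per-incident report under the proposal;
    field names are invented for illustration."""
    manufacturer: str
    incident_type: IncidentType
    occurred_on: date
    identified_on: date   # report is due when the manufacturer identifies it
    phase: str            # "testing" or "deployment" -- the bill covers both
    description: str
```

On top of these, the bill also wants quarterly roll-ups: vehicle miles traveled, unplanned stops, and wheelchair-accessible service stats.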
 
> After digging more on this tonight, I managed to find the report in the NHTSA dataset. The accident happened in San Francisco and wasn't reported to the CA DMV. (...)
They only report to the DMV when operating under their testing permit. Their pay-per-ride service operates under their CPUC deployment permit.
 
> They only report to the DMV when operating under their testing permit. Their pay-per-ride service operates under their CPUC deployment permit.

Their pay-per-ride service operates under both a CA DMV permit and a CPUC permit.

The CPUC even said this when issuing their permits:

> Additionally, both Cruise and Waymo possess an Autonomous Vehicle Deployment Program Permit issued by the California Department of Motor Vehicles (DMV). This DMV permit is a prerequisite for AV deployment and is distinct from the CPUC's permit, which is an additional requirement for companies that provide transportation services to the public using AVs. Participants in the CPUC's AV programs must also maintain the relevant DMV AV permit in good standing.
 
AFAIK, in CA:
- When testing with a safety driver, AV companies have to report disengagements and collisions.
- When testing without a safety driver, AV companies only have to report collisions.
- When deployed driverless in a commercial service, they only have to report collisions.
 
> Their pay-per-ride service operates under both a CA DMV permit and a CPUC permit.
But it's a DMV deployment permit, not a testing permit. Go to this DMV page and click on the links for Driverless Testing and Testing with a Safety Driver. Under the "Requirements" area of each you'll see rules for Collision Reporting. But click on the link for Autonomous Vehicle Deployment Program and you'll find no such requirement for collision reporting. I'm sure if you dig deep enough into the actual regulations you'll see where this is all spelled out. I found out about it by reading some article or blog post I didn't save, but found the DMV web page which backs it up.
 
> on incidents where the car broke the law.
The problem with "broke the law" is that it happens all the time. Driving 1 mph above the speed limit is breaking the law. Crossing solid lines is breaking the law, even if it is to give safe space to bikes.

Is there some kind of gradation of traffic laws where breaking some is more severe than breaking others? The below would be really bad in my book:
- Skipping red lights, stop signs
- Cutting off other vehicles aggressively
- Tailgating
...

I think AVs should never commit these "class A" traffic violations, whereas "class C" violations are OK if there is some valid reason (see the toy sketch below).
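
As a toy sketch of the gradation I mean (the bucket assignments are just my opinion from the list above):

```python
from enum import Enum

class ViolationClass(Enum):
    A = "never acceptable"
    C = "acceptable with a valid reason"

# Rough buckets based on the list above -- debatable, of course.
CLASS_OF = {
    "run_red_light": ViolationClass.A,
    "run_stop_sign": ViolationClass.A,
    "aggressive_cutoff": ViolationClass.A,
    "tailgating": ViolationClass.A,
    "cross_solid_line": ViolationClass.C,  # e.g. giving a cyclist room
    "speed_1mph_over": ViolationClass.C,
}

def av_may_commit(violation: str, has_valid_reason: bool) -> bool:
    """Class A: never. Class C: only with a valid reason.
    Unknown violations default to the strict class."""
    cls = CLASS_OF.get(violation, ViolationClass.A)
    return cls is ViolationClass.C and has_valid_reason
```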
 