Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Autonomous Car Progress

Is the claim you're making that Waymo makes 0 mistakes while driving in LA? I'd like to see the data to back up that claim.

Has Waymo ever run a stop sign in LA? If so, how many? If it did so without disengagement, I don't think that data would be collected or reported anywhere.
The data is out there, unlike Tesla's. You can take a look at it if you want.

Unlike Tesla, Waymo has to report every single disengagement in CA, the road type where the disengagement happened and a description of the cause of the disengagement.

Because you knew that, you resorted to "well it doesn't mean anything because if it did run a stop sign there would be no disengagement".
Now you are saying the safety drivers wouldn't take over, or that it runs stop signs while driverless. Gotta love that mental gymnastics.

All Hail Tesla The Great, The most transparent!
Shame on Waymo the secretive!
 
The data is out there, unlike Tesla's. You can take a look at it if you want.

Both companies publish all the disengagement data they're legally required to publish. Tesla is level 2. They can't just publish the time, date, and location of customer vehicle disengagements; that would be a huge violation of privacy.

If you have data showing how often driverless Waymos have run stop signs, post it. But I'm willing to bet it doesn't exist, because they're not legally required to count how many stop signs they run; they're only legally required to publish disengagements.

And if you're claiming that Waymo software is perfect and has never once made a mistake like running a stop sign, you're the only one doing mental gymnastics. There's no such thing as perfect software.
 
...
If you have data showing how often driverless Waymos have run stop signs, post it. But I'm willing to bet it doesn't exist, because they're not legally required to count how many stop signs they run; they're only legally required to publish disengagements.
...

Agreed. However, if San Francisco police had ticketed Waymo vehicles for that behaviour, this would have been included as part of their complaint. Instead they basically said Waymo might be a problem because Cruise does have problems.

We can be fairly sure that Waymo vehicles do not regularly break the law in front of San Fran cops.
 
If you have data showing how often driverless Waymos have run stop signs, post it. But I'm willing to bet it doesn't exist, because they're not legally required to count how many stop signs they run; they're only legally required to publish disengagements.
They are required to report everything.

[Screenshots of CA DMV disengagement reports for Imagry, Aurora, Pony.ai, and Zoox]

And if you're claiming that Waymo software is perfect and has never once made a mistake like running a stop sign, you're the only one doing mental gymnastics. There's no such thing as perfect software.
Nobody is claiming Waymo is perfect; they just haven't had to report running stop signs because their software is robust. Their safety drivers haven't had to disengage because of stop-sign issues.
 
We can be fairly sure that Waymo vehicles do not regularly break the law in front of San Fran cops.

Likewise, if FSD Beta ran a stop sign and the driver received a ticket for it, you can guarantee we'd see articles about it from the Verge, Business Insider, Bloomberg, et al.

But at the end of the day, there's no hard data for either Waymo or Tesla on these types of failures, and we're left to compare anecdotes to anecdotes.

In theory, I'd guess they do it less often due to their HD maps, but in practice it's hard to give exact figures.
They are required to report everything.

Thanks for providing this. But that's only disengagements from vehicles with safety drivers. If a driverless vehicle runs a stop sign in such a way that it's unaware that it made a mistake, would it be reported?
 
Thanks for providing this. But that's only disengagements from vehicles with safety drivers. If a driverless vehicle runs a stop sign in such a way that it's unaware that it made a mistake, would it be reported?
If you look at the videos of people taking driverless Waymos in SF, it responds well to stop signs, even obstructed ones and temporary ones held by traffic control workers. You won't see a single Waymo running a stop sign. Seriously, look through Maya on YouTube. Also, @JJRicks kept a meticulous record of all his Waymo rides in AZ, and there is no report of a stop sign violation.

 
If you look at the videos of people taking driverless Waymos in SF, it responds well to stop signs, even obstructed ones and temporary ones held by traffic control workers. You won't see a single Waymo running a stop sign. Seriously, look through Maya on YouTube. Also, @JJRicks kept a meticulous record of all his Waymo rides in AZ, and there is no report of a stop sign violation.


I don't disagree that mistakes would be rare. But that leaves us with two possibilities. Either:

1. In over 20 million miles driven on public roads, Waymo has never once failed to stop at a stop sign. Including cases where temporary stop signs may have been erected for construction and occluded, or any other number of possible scenarios they could have encountered.

2. Waymo does not publicly report data on such mistakes.

I think 2 is more likely than 1.
 
I don't disagree that mistakes would be rare. But that leaves us with two possibilities. Either:

1. In over 20 million miles driven on public roads, Waymo has never once failed to stop at a stop sign. Including cases where temporary stop signs may have been erected for construction and occluded, or any other number of possible scenarios they could have encountered.

2. Waymo does not publicly report data on such mistakes.

I think 2 is more likely than 1.

Agreed. Those one-in-a-million bugs take a long time to resolve, especially if they can go undetected when they do occur.

2a) The Waymo safety driver, remote staff, the customer, and the general public (pedestrians) did not notice (or complain) that the Waymo vehicle ran through the stop sign either. They only report mistakes that are noticed.

Stopping late at a stop sign is much easier to detect than not stopping at all, because the vehicle itself can be programmed to log that event: it began stopping due to detection of the sign, and can flag all cases where rapid deceleration was required for review.

Edit: The only way to detect this mistake in an automated fashion would be to run older trips against the newest HD map (particularly where new signs were added to the map) to see where behaviour doesn't match, then review trip video footage to see whether the sign is actually newly installed or just newly detected. This mechanism could catch several types of sign problems: speed limits, school zones, no left/right turns, etc. Figuring out why a sign was missed may also help detect additional signs the system does not yet know about.
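The map-replay idea above can be sketched in a few lines. To be clear, this is a hypothetical illustration, not anything Waymo actually runs: the `Sample` log format, the `near` radius, and the `stop_speed` threshold are all made-up assumptions, and a real pipeline would also check heading, lane assignment, and right-of-way before flagging anything.

```python
from dataclasses import dataclass

@dataclass
class Sample:
    x: float      # position in the map frame, metres
    y: float
    speed: float  # m/s

def near(sample, sign, radius=8.0):
    # True if this log sample was recorded within `radius` metres of the sign.
    return (sample.x - sign[0]) ** 2 + (sample.y - sign[1]) ** 2 <= radius ** 2

def flag_missed_stops(trip, stop_signs, stop_speed=0.5):
    """Return the mapped stop signs this trip passed without slowing below stop_speed.

    A sign is 'missed' if the trip has samples near it but none of them is at
    (near-)zero speed. Replaying old trips against a newer map catches signs
    the vehicle didn't know about at the time.
    """
    missed = []
    for sign in stop_signs:
        nearby = [s for s in trip if near(s, sign)]
        if nearby and min(s.speed for s in nearby) > stop_speed:
            missed.append(sign)
    return missed

# Example: one sign at the origin; the trip rolls past at ~6 m/s without stopping.
rolling = [Sample(-6.0, 0.0, 7.0), Sample(0.0, 0.0, 6.0), Sample(6.0, 0.0, 7.0)]
print(flag_missed_stops(rolling, [(0.0, 0.0)]))  # -> [(0.0, 0.0)]
```

A trip that actually comes to rest inside the radius would produce an empty list, so only rolled-through signs surface for human video review.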
 
Agreed. However, if San Francisco police had ticketed Waymo vehicles for that behaviour, this would have been included as part of their complaint. Instead they basically said Waymo might be a problem because Cruise does have problems.

We can be fairly sure that Waymo vehicles do not regularly break the law in front of San Fran cops.
SF police can't ticket Waymo for moving violations; that's one of the complaints the SFMTA has. So even if a cop saw it, no such tickets or reports would exist! They can only ticket for equipment issues, for example a broken light.
 
They are required to report everything.

[Screenshots of CA DMV disengagement reports for Imagry, Aurora, Pony.ai, and Zoox]


Nobody is claiming Waymo is perfect, they just haven't had to report running stop signs because their software is robust. Their safety drivers haven't had to disengage because of stop sign issues.
Those are all disengagements. You missed the point completely. They are not required to report running stop signs; they are only required to report disengagements. Furthermore, once at the commercial stage they aren't required to report disengagements at all.
 
If you look at the videos of people taking driverless Waymos in SF, it responds well to stop signs, even obstructed ones and temporary ones held by traffic control workers. You won't see a single Waymo running a stop sign. Seriously, look through Maya on YouTube. Also, @JJRicks kept a meticulous record of all his Waymo rides in AZ, and there is no report of a stop sign violation.

YouTuber recordings are a small subset of total rides, and there are drastically fewer of them than even FSD Beta recordings (which themselves cover a small subset of total rides), so I highly doubt looking at them will give you a good count, especially if the claim is that there have been zero or close to zero occurrences.
 
Bentley goes with Mobileye.

Bentley's first all-electric vehicle will feature hands-off self-driving technology, it has been revealed. Speaking with UK outlet AutoCar, Bentley CEO Adrian Hallmark confirmed that the British marque's first EV will initially offer partial hands-off driving on motorways. Fully autonomous driving will come later, potentially through over-the-air software updates.

Hallmark also confirmed that Bentley will use Mobileye’s SuperVision technology. The SuperVision system features 11 cameras and will also be used on the upcoming Porsche Macan EV.

Source: Bentley's First EV To Come With Hands-Off Self-Driving Tech
 
Criminy. Does anyone live in that area who can just drive the same path through that intersection?


Someone indeed retested.

And, shocker, the car ends up IN the intersection, past the stop line, just like Ross. Post about it here:


 
Someone indeed retested.

And, shocker, the car ends up IN the intersection, past the stop line, just like Ross. Post about it here:



It's amazing just how much mental gymnastics almost all of Tesla's fanbase did just to absolve Tesla of any responsibility in this situation, instead of simply admitting what was obvious on video. Seems like doing that is somehow considered a crime against humanity, and 9,000 excuses will be generated to avoid it.

This should be a wake-up call to anyone who is still trying to reason with some Tesla fans.
It's literally impossible.
 
This should be a wake-up call to anyone who is still trying to reason with some Tesla fans.
It's literally impossible.
You need a certain degree of belief to believe in camera-only L5 and generalized humanoid robots this decade.

I'm just waiting for them to fix my auto-wipers and ship software for adaptive headlights. Auto high beam took about three years. Deep Rain didn't work, and now "actual deep rain" will happen in a few Elon months, supposedly. But hey, they saved 5 bucks per car by skipping that sensor that everyone else uses. Wen "actual FSD"? 2040?
 
But hey, they saved 5 bucks per car,
You sure it is only $5 per car? I'm not positive, but it looks like Rivian charges ~$180 for the sensor, and it has to be replaced when you replace your windshield:

[Screenshot of a Rivian parts listing for the rain sensor]


Even a BMW rain sensor for a 10-year-old car costs ~$110: https://www.partsgeek.com/wbj73j6-bmw-328i-rain-sensor.html (Some of them are closer to $250.)

And that doesn't cover the cost of the harness and everything else in the car.

Yes, I know what they charge for parts isn't what they cost the OEM, but they aren't marking them up 50x.
 
You sure it is only $5 per car?
Let's cherry-pick a meaningless piece of my post and write a wall of text on whether I got it exactly right? Let's say it's $20 if you buy 2M units and move on. The point is that pretty much everyone would pay $200 more for a car with a functioning auto wiper if given the option.

However, it doesn't change the fact that you need a certain degree of belief to believe in camera-only L5 and generalized humanoid robots this decade. Listen to the e2e panel from CVPR'23 and get grounded in reality on where the science is.