Waymo

But none of them has been safety related; how are people unable to understand this?
Being stopped in the middle of moving traffic for no reason isn't a safety related issue? He could have been rear ended and injured at any point during this last event.

Or are you just trying to say it didn't stop for a safety related issue? :rolleyes: Because it certainly became safety related the second it decided to stop in traffic.
 
If Waymo can't get their act together with HD maps, the best machine learning experts, and the most experience, how do we expect any of these other Waymo approach companies to succeed? Heck, can Tesla beat Waymo's performance in a general approach? Even if Tesla does beat Waymo's performance in a general approach, they'll need to be far better (than one disengagement every ~1000 miles) to start a robotaxi service.

We know how Tesla performs: they have a safety disengagement every 1-5 miles on average.
While the Waymo driver got confused due to the input provided by remote assistance, it never tried to crash into objects or cones. That's what safety disengagement stats are all about.

And if we are trying to compare... this is how FSD Beta handles construction, and it hasn't improved.
The thing about the Waymo driver is that while it can't handle every situation, it won't crash when it fails to handle one, unlike FSD Beta, which will confidently drive you into a barrier, a wall, cones, construction vehicles, or off a cliff.
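To put those two claimed rates side by side (the 1-5 mile and ~1000 mile figures are the posters' rough estimates, not official statistics), a quick back-of-the-envelope comparison:

```python
# Rough comparison of the disengagement rates claimed in this thread.
# Both figures are forum estimates, not official data.
fsd_miles_per_disengagement = (1, 5)       # claimed FSD Beta range
waymo_miles_per_disengagement = 1000       # claimed Waymo figure (~1 per 1,000 mi)

for miles in fsd_miles_per_disengagement:
    gap = waymo_miles_per_disengagement / miles
    print(f"At 1 disengagement per {miles} mi, the claimed Waymo rate is ~{gap:.0f}x better")
```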
 
Being stopped in the middle of moving traffic for no reason isn't a safety related issue? He could have been rear ended and injured at any point during this last event.

Or are you just trying to say it didn't stop for a safety related issue? :rolleyes: Because it certainly became safety related the second it decided to stop in traffic.

It sure created a safety issue for all the cars that had to go into oncoming traffic lanes to get around the Waymo.
 
Really, you're absolutely sure about that? There is no chance that Waymo fails?

Of course we know they will never be able to "remove all roadside assistance," as cars break down, so at a minimum they will need to keep it around for flat tires and other unexpected breakdowns.

In case of a system failure like a flat tire, Waymo is designed to pull over safely.

But basically, Waymo just needs to get roadside assistance, for the cases where the self-driving system does not know what to do, down to a minimum, so that it happens extremely rarely.
 
It never tried to crash into objects or cones.

It came really close to hitting a vehicle that had driven around the stopped vehicle, at around 23:46 in the video.


 
Being stopped in the middle of moving traffic for no reason isn't a safety related issue? He could have been rear ended and injured at any point during this last event.

Or are you just trying to say it didn't stop for a safety related issue? :rolleyes: Because it certainly became safety related the second it decided to stop in traffic.

No, absolutely not. This is no different from a stopped car taking a long time to make a left turn into a street, road, or store parking lot, or from the first car at a red light and the car directly behind it stopping, thereby creating a backup of 5-10 cars.

Anyone who thinks they were in danger of being rear-ended needs to stop driving ASAP, because THEY are the danger. This is within the natural flow and events of daily driving, no different from the risk you took when you exited your garage.
 
In case of a system failure like a flat tire, Waymo is designed to pull over safely.

But basically, Waymo just needs to get roadside assistance, for the cases where the self-driving system does not know what to do, down to a minimum, so that it happens extremely rarely.
Well, since it appeared to judge the curb lane as undrivable space, it didn't seem to know where to go to pull over safely. It stopped as far right as it could; unfortunately that was in the middle of the road, but it was pulled over as far to the right as it thought it could go. We would pull off into that construction curb lane in a breakdown, even if it was a "closed" section, because it would be less risky. The Waymo car could have had a better failure-mode strategy.
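Purely to illustrate that last point, here is a minimal sketch of what a different failure-mode strategy could look like: ranking candidate stop locations and, during a mechanical breakdown, allowing physically drivable but "closed" space such as a coned-off shoulder. The locations, risk scores, and logic below are hypothetical and are not Waymo's actual planner.

```python
# Illustrative only: hypothetical ranking of emergency stop locations.
# Lower risk score = preferred. Not based on Waymo's real planning stack.
CANDIDATE_STOPS = [
    {"location": "stop in the travel lane",         "risk": 0.9, "closed_to_traffic": False},
    {"location": "coned-off construction shoulder", "risk": 0.4, "closed_to_traffic": True},
]

def pick_emergency_stop(candidates, breakdown):
    # During a breakdown, treat drivable but "closed" space as usable,
    # on the theory that stopping in moving traffic is the higher risk.
    usable = [c for c in candidates if breakdown or not c["closed_to_traffic"]]
    return min(usable, key=lambda c: c["risk"])

print(pick_emergency_stop(CANDIDATE_STOPS, breakdown=True)["location"])
# -> coned-off construction shoulder
```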
 
Not even close. Like, seriously? That happens literally every day when someone cuts you off.

Which vehicle would be held at fault if a collision happened in that scenario? The silver car was pulling around a seemingly incapacitated vehicle, and the Waymo was actively steering out into them.

This wasn't just being "cut off," this was a remote operator failure which caused a situation the Waymo driving policy was unable to handle. It's pretty clear to me that the vehicle placed itself in an unsafe situation.
 
Here is the current text of the job posting I wrote looking for an expert to write a paper analyzing Waymo's safety data in combination with other automotive safety data. I would appreciate any feedback or constructive critique in order to improve the proposed methodology.

I'm looking for one or more experts to author an original research paper based on the data presented in the paper "Waymo Public Road Safety Performance Data". The paper is attached to this job posting and is also viewable at: https://bit.ly/waymopaper
Specifically, I want the expert(s) to:
1. Find a naturalistic human driving data set that rigorously counts all contacts/collisions, not just police-reported ones.
2. Use any available data sources (including those comprising police-reported vehicle collisions) to estimate how much lower (or higher) the rates of collisions are in Waymo's operating areas in the Phoenix suburbs versus the figures in the naturalistic driving data set.
3. With this comparison in (2), adjust the rates of collisions from the naturalistic human driving data set accordingly in order to enable an apples-to-apples comparison with the collision rates presented in Waymo's paper.
4. Using these adjusted rates from (3), assess how well Waymo's collision rates stack up against these new human benchmarks derived from the naturalistic driving data.
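To make the arithmetic in steps 2-4 concrete, here is a minimal sketch of the intended adjustment. All of the numbers below are placeholders for illustration, not figures from Waymo's paper or any naturalistic study, and rates are expressed per million vehicle miles.

```python
# Placeholder numbers only -- the point is the shape of the calculation, not the values.
naturalistic_contact_rate = 30.0   # step 1: all-contact rate from a naturalistic data set (hypothetical)
national_police_reported  = 2.0    # step 2 input: police-reported collision rate, nationwide (hypothetical)
chandler_police_reported  = 1.6    # step 2 input: police-reported rate in Waymo's operating area (hypothetical)
waymo_contact_rate        = 10.0   # the rate reported in Waymo's paper (hypothetical stand-in)

# Step 2: how much lower (or higher) are collision rates in Waymo's operating area?
area_adjustment = chandler_police_reported / national_police_reported

# Step 3: scale the naturalistic benchmark to Waymo's operating area.
adjusted_benchmark = naturalistic_contact_rate * area_adjustment

# Step 4: compare Waymo's rate against the adjusted human benchmark.
print(f"Adjusted human benchmark: {adjusted_benchmark:.1f} contacts per 1M miles")
print(f"Waymo rate / benchmark:   {waymo_contact_rate / adjusted_benchmark:.2f}")
```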
 
Already 2-3 years ago, Amnon Shashua said “sensing is done,” meaning the problem is the driving policy and planning. I think this video makes that clear.
This is what makes me pessimistic about Tesla getting to Level 5 anytime in the next few years. They're still figuring out perception. I can't imagine they're very far along on driving policy and planning.
 
It’s as Karpathy says, it’s better to sacrifice sensing for scale.

That makes no sense. If you sacrifice sensing, your car will have a harder time detecting objects, which will make FSD less safe.

If we start fitting all the recent Waymo news together, it doesn't look good for the Waymo approach:

Long-time CEO, CFO, partnerships guy, etc. leaving the company

No service expansion in the Chandler area

This big, messy disengagement, where there was mass incompetence and people had no idea what was going on or how to stop the car

If Waymo can't get their act together with HD maps, the best machine learning experts, and the most experience, how do we expect any of these other Waymo approach companies to succeed? Heck, can Tesla beat Waymo's performance in a general approach? Even if Tesla does beat Waymo's performance in a general approach, they'll need to be far better (than one disengagement every ~1000 miles) to start a robotaxi service.

I am not worried about Waymo.

Waymo's approach is fine:
- They have good perception, prediction and planning.
- They have 20M autonomous miles of experience. The Waymo Driver can handle tens of thousands of edge cases. The fleet is learning.
- They have ride-hailing in one location and are poised to launch in SF soon.
- They are demonstrating driverless operation.
- I am sure Waymo has already learned from that bad experience in JJ's video and improved the software and the remote assistance protocols.
- The 5th gen hardware is even better and cheaper.

I am more worried about Tesla's approach:

In 5 years:
- They are still at L2.
- They are still doing rewrites on basic perception.
- They are still figuring out which basic sensors to use, ditching a major sensor like radar and needing to revalidate their entire stack to make sure it is still safe.
- FSD Beta only has about 150,000 miles of experience.
- They have no robotaxis, not even in testing.
- No ride-hailing service.
 
What is the cost per mile Waymo is charging its passengers? What is the average time it takes the self-driving Waymo to pick up a passenger?

$0.80 per mile with a $4.99 minimum.
Average wait time is about 11 minutes, according to JJ's spreadsheet.
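Assuming those quoted figures are accurate, the fare for a trip would work out roughly like this (a sketch based only on the per-mile rate and minimum mentioned above; Waymo's actual fare structure may include components not listed here):

```python
# Rough fare estimate from the figures quoted above ($0.80/mile, $4.99 minimum).
# Waymo's real pricing may include base or time charges not mentioned in this thread.
PER_MILE = 0.80
MINIMUM = 4.99

def estimated_fare(miles: float) -> float:
    return max(MINIMUM, PER_MILE * miles)

print(estimated_fare(3))    # short trip   -> 4.99 (minimum applies)
print(estimated_fare(10))   # 10-mile trip -> 8.0
```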