Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Waymo brings in $2.25 billion from outside investors, Alphabet

They hate it because once you get where they are it's not a way to measure progress. It's also a bad way to compare companies because they test in different places. They believe their true necessary disengagement rate is far lower but they can't tell their test drivers to not intervene when they feel unsafe!
I do not believe Tesla's FSD could drive around San Francisco for even a mile without a disengagement. There were disengagements during the autonomy day demo rides on a single premapped route.
 
Waymo is willing to risk a ride for reporters to show off (in broad daylight, perfect weather conditions, preplanned route), but they are not willing to place the system in real use. They are too afraid of losing millions in a lawsuit. Plus the bad publicity would do millions more in harm to Google. The system is good, but it won't be launching anytime soon, since there will always be risk. Are there any unprotected left turns at busy intersections?
So people invested $2.25 billion for a system that they don't plan to deploy? o_O
They do appear to be driving around Chandler with no drivers.
 
They hate it because once you get where they are it's not a way to measure progress. It's also a bad way to compare companies because they test in different places. They believe their true necessary disengagement rate is far lower but they can't tell their test drivers to not intervene when they feel unsafe!
I do not believe Tesla's FSD could drive around San Francisco for even a mile without a disengagement. There were disengagements during the autonomy day demo rides on a single premapped route.

But you are taking the disengagement figure and assuming that it relates to driving around San Francisco. It might have been generated on city streets in the rush hour, or at 3am when it's quiet, or even on highways which are a much more forgiving environment.

For better or worse, self-driving has been identified as the next must-have accessory, so there is a huge incentive for companies to get their names/technology into the minds of the public. Whatever does that is worth it, including $2.25 billion that makes everyone sit up and go "Ooooo"
 
But you are taking the disengagement figure and assuming that it relates to driving around San Francisco. It might have been generated on city streets in the rush hour, or at 3am when it's quiet, or even on highways which are a much more forgiving environment.

For better or worse, self-driving has been identified as the next must-have accessory, so there is a huge incentive for companies to get their names/technology into the minds of the public. Whatever does that is worth it, including $2.25 billion that makes everyone sit up and go "Ooooo"
All their accidents are in San Francisco so it seems plausible. Report of Traffic Collision Involving an Autonomous Vehicle (OL 316)
 
But you are taking the disengagement figure and assuming that it relates to driving around San Francisco. It might have been generated on city streets in the rush hour, or at 3am when it's quiet, or even on highways which are a much more forgiving environment.

It is true that we don't know exactly what roads or conditions make up the 800k miles that Cruise did last year. But I think we can reasonably assume that Cruise's 800k autonomous miles were made up of lots of different driving scenarios. For one, Cruise's 800k miles were not a joy ride; they were testing their cars in the specific situations that they want their autonomous driving to be able to handle. So it stands to reason that Cruise would have included some difficult driving cases in the 800k miles. Also, I highly doubt that Cruise filmed a couple hours of demos and then spent 800k miles over an entire year just driving on empty streets in order to cheat the disengagement report. That would be a total waste of time and resources. Lastly, statistically, 800k miles is a pretty large sample. If Cruise picked a balanced set of different routes, then the easy routes would roughly cancel out the more difficult routes, so the disengagement rate would be a fairly representative average. I am not saying the disengagement rate is a perfect metric, but I think it would be a pretty accurate average of all the routes around San Francisco where we know Cruise does its testing.
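The sample-size point above can be sanity-checked with some back-of-the-envelope arithmetic. This is a rough sketch, not anything from the official filing: the 68-disengagement count is the one quoted later in this thread from the CA DMV report, the 800k mileage is the round number used in this post, and modeling disengagements as a Poisson count is my own assumption.

```python
import math

# Assumed inputs: the round 800k-mile figure from this thread and the
# 68-disengagement total quoted from the CA DMV report. The exact mileage
# in the official filing may differ slightly.
miles = 800_000
disengagements = 68

# Headline metric: autonomous miles driven per disengagement.
rate = miles / disengagements
print(f"{rate:,.0f} miles per disengagement")

# Treating disengagements as a Poisson count, n +/- 1.96*sqrt(n) gives a
# rough 95% interval on the count, which bounds the implied rate.
spread = 1.96 * math.sqrt(disengagements)
lo = miles / (disengagements + spread)
hi = miles / (disengagements - spread)
print(f"rough 95% interval: {lo:,.0f} to {hi:,.0f} miles per disengagement")
```

Even with that statistical slack, the interval stays in the five-figure range, which is the poster's point about 800k miles being a large enough sample to average over route difficulty.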
 
Do they only have to report disengagements that involve accidents?

No, they have to report ALL disengagements. The CA report lists what each disengagement was:

Cruise had a total of 68 disengagements that were caused by the following:
"precautionary takeover to address perception, AV made unsuccessful left turn"
"precautionary takeover to address perception, other road user behaving poorly"
"precautionary takeover to address planning, precautionary takeover to address controls, AV made unsuccessful left turn"
"precautionary takeover to address planning, third party lane encroachment"
"precautionary takeover to address planning, AV lane change issues"
"precautionary takeover to address planning, third party lane obstruction"

Source:
https://www.dmv.ca.gov/portal/wcm/c...cleDisengagementReports.csv?MOD=AJPERES&CVID=
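Since the post above pulls its list of causes from the DMV's CSV, here is a hedged sketch of how those cause strings can be tallied. The column names are an assumption on my part (the real file's headers may differ), and a small inline sample stands in for the actual download.

```python
import csv
import io
from collections import Counter

# Inline stand-in for the DMV disengagement CSV; column names are assumed,
# not taken from the real file.
sample = """Manufacturer,Description of Facts Causing Disengagement
Cruise,"precautionary takeover to address perception, AV made unsuccessful left turn"
Cruise,"precautionary takeover to address planning, third party lane encroachment"
Cruise,"precautionary takeover to address planning, AV lane change issues"
"""

# Tally each distinct cause string for Cruise rows. csv.DictReader handles
# the quoted fields that contain commas.
counts = Counter(
    row["Description of Facts Causing Disengagement"]
    for row in csv.DictReader(io.StringIO(sample))
    if row["Manufacturer"] == "Cruise"
)

for cause, n in counts.most_common():
    print(n, cause)
```

Against the real file you would replace `sample` with the downloaded CSV text and sum the counts to recover the 68 total cited above.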
 
It is true that we don't know exactly what roads or conditions make up the 800k miles that Cruise did last year. But I think we can reasonably assume that Cruise's 800k autonomous miles were made up of lots of different driving scenarios. For one, Cruise's 800k miles were not a joy ride; they were testing their cars in the specific situations that they want their autonomous driving to be able to handle. So it stands to reason that Cruise would have included some difficult driving cases in the 800k miles. Also, I highly doubt that Cruise filmed a couple hours of demos and then spent 800k miles over an entire year just driving on empty streets in order to cheat the disengagement report. That would be a total waste of time and resources. Lastly, statistically, 800k miles is a pretty large sample. If Cruise picked a balanced set of different routes, then the easy routes would roughly cancel out the more difficult routes, so the disengagement rate would be a fairly representative average. I am not saying the disengagement rate is a perfect metric, but I think it would be a pretty accurate average of all the routes around San Francisco where we know Cruise does its testing.

"reasonably assume" "stands to reason" "I highly doubt"

See? Isn't this how you ended up buying Tesla's AP?
 
Yeah, reusable rockets. So 1980s
Yeah. Not exactly cost effective but it did look cool.
 
Had a look at a few Cruise accident reports. Seem to be throughout the day.

Do they only have to report disengagements that involve accidents?
They are required to report all disengagements. In many of the accidents the safety driver disengaged immediately before the accident. Most of the accidents are people rear ending their test cars (probably phantom braking!).
 
And does anyone double check this?

Again, there is a lot of money riding on the success of these systems.

And don't say that a car maker wouldn't dream of manipulating safety data. It's happened before, it will happen again.

Well, they are required by law to report all disengagements. Could they still try to cheat? Sure. But if they got caught cheating, there would be consequences.
 
And does anyone double check this?

Again, there is a lot of money riding on the success of these systems.

And don't say that a car maker wouldn't dream of manipulating safety data. It's happened before, it will happen again.
I don't understand all the conspiracy talk. There is no way to cheat when it comes to self-driving. Sure, if you have a safety driver you can cheat, but a self-driving car with a safety driver is 100% useless. Once you remove the safety driver there is no way to cheat. What are they going to do, have their cars flee the scene of accidents? They're easy to identify...
I doubt any of these companies are going to go public until they have deployed.
 
I don't understand all the conspiracy talk. There is no way to cheat when it comes to self-driving. Sure, if you have a safety driver you can cheat, but a self-driving car with a safety driver is 100% useless. Once you remove the safety driver there is no way to cheat. What are they going to do, have their cars flee the scene of accidents? They're easy to identify...
I doubt any of these companies are going to go public until they have deployed.

There is no way to cheat self-driving, but how you measure and present progress toward that goal is open to manipulation.

They have to report disengagements that involve accidents, and there are others where the safety driver just took over. Are there others that go unreported? Disengagements also say nothing about average speed, hesitancy, transitions between road types, and other factors which may or may not irritate passengers or other road users.

We all have crude labels for other drivers; grandma, menace, idiot, Audi owner .... and worse ;)

I can imagine self-driving cars being called carseholes by other road users, partly because of the silly sensors, but also because of a perceived type of driving behaviour.

I guess my problem is that it isn't solved until it is solved. Appearing to be close to a working product isn't the same as having a working product.