If you can drive around San Francisco and only disengage every 10k miles, that seems like they're making real progress
You would think so, wouldn't you...
Everyone hates California’s self-driving car reports
They hate it because, once you get to where they are, it's no longer a useful way to measure progress. It's also a bad way to compare companies, because they test in different places. They believe their true necessary disengagement rate is far lower, but they can't tell their test drivers not to intervene when they feel unsafe!
So people invested $2.25 billion for a system that they don't plan to deploy? Waymo is willing to risk a ride for reporters to show off (in broad daylight, perfect weather, preplanned route), but they are not willing to put the system into real use. They are too afraid of losing millions in a lawsuit, plus the bad publicity would cause millions more in harm to Google. The system is good, but it won't be launching anytime soon, since there will always be risk. Are there any unprotected left turns at busy intersections?
I do not believe Tesla's FSD could drive around San Francisco for even a mile without a disengagement. There were disengagements during the autonomy day demo rides on a single premapped route.
For better or worse, self-driving has been identified as the next must-have technology, so there is a huge incentive for companies to get their names/technology into the minds of the public. Whatever does that is worth it, including a $2.25 billion investment that makes everyone sit up and go "Ooooo"
Now that I think about it, 2.25 billion is a ridiculous amount of money.
Falcon 9 development was at most 400 million and Falcon Heavy was around 500 million.
What the hell is Waymo doing?
But you are taking the disengagement figure and assuming that it relates to driving around San Francisco. It might have been generated on city streets in the rush hour, or at 3am when it's quiet, or even on highways which are a much more forgiving environment.
All their accidents are in San Francisco so it seems plausible. Report of Traffic Collision Involving an Autonomous Vehicle (OL 316)
Trying to solve the most difficult engineering problem ever. Doing something that has been done before is usually cheaper.
Do they only have to report disengagements that involve accidents?
It is true that we don't know exactly what roads or conditions make up the 800k miles that Cruise drove last year. But I think we can reasonably assume that Cruise's 800k autonomous miles were made up of lots of different driving scenarios. For one, those 800k miles were not a joy ride: they were testing their cars in the specific situations they want their autonomous driving to be able to handle, so it stands to reason that Cruise would have included some difficult driving cases. Also, I highly doubt that Cruise filmed a couple of hours of demos and then spent 800k miles over an entire year just driving on empty streets in order to game the disengagement report. That would be a total waste of time and resources.

Lastly, statistically, 800k miles is a decent sample size. If Cruise picked a balanced set of routes, the easy routes would roughly cancel out the more difficult ones, so the disengagement rate would be a fairly representative average. I am not saying the disengagement rate is a perfect metric, but I think it would be a pretty accurate average across the routes around San Francisco where we know Cruise does its testing.
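The averaging argument above can be sketched numerically. The route mix and per-route disengagement counts below are invented purely for illustration (they are not Cruise's actual data); the point is that the reported fleet-wide figure is a mileage-weighted average, which can look much better than the hardest routes in the mix:

```python
# Hypothetical route mix for an 800k-mile test year.
# All numbers are made up for illustration only.
routes = [
    # (miles driven, disengagements on those miles)
    (500_000, 25),   # quiet residential streets
    (200_000, 40),   # daytime downtown traffic
    (100_000, 35),   # dense rush-hour driving
]

total_miles = sum(m for m, _ in routes)
total_disengagements = sum(d for _, d in routes)

# The reported figure is simply fleet miles divided by fleet disengagements.
fleet_rate = total_miles / total_disengagements
print(f"fleet-wide: one disengagement per {fleet_rate:,.0f} miles")

# Per-route rates show how much the mix matters.
for miles, dis in routes:
    print(f"  {miles:>7,} miles -> one per {miles / dis:,.0f} miles")
```

With these made-up numbers the fleet averages one disengagement per 8,000 miles, even though the rush-hour slice manages only about one per 2,900 miles. A balanced route mix pulls the average toward something representative; a skewed one can hide the hard cases.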
Yeah, reusable rockets. So 1980s
They are required to report all disengagements. In many of the accidents the safety driver disengaged immediately before the accident. Most of the accidents are people rear-ending their test cars (probably phantom braking!).

Had a look at a few Cruise accident reports. They seem to happen throughout the day.
No, they have to report ALL disengagements.
Yeah. Not exactly cost effective but it did look cool.
And does anyone double check this?
Again, there is a lot of money riding on the success of these systems.
And don't say that a car maker wouldn't dream of manipulating safety data. It's happened before, it will happen again.
The solid boosters were also reusable. The fuel tank was not.

LOL. It did indeed. Sadly, the only section that landed was the one with the human pilots.
I don't understand all the conspiracy talk. There is no way to cheat when it comes to self-driving. Sure if you have a safety driver you can cheat but a self-driving car with a safety driver is 100% useless. Once you remove the safety driver there is no way to cheat. What are they going to do, have their cars flee the scene of accidents? They're easy to identify...
I doubt any of these companies are going to go public until they have deployed.