What actually matters is having something available. I suspect Tesla will be the only car company with Level 3 or higher in the next 2+ years. Mobileye says 2025, and they have missed their dates before. Waymo is not even competing in this space. A large movement towards robotaxi service might mitigate this. I'm also dubious about how safe Tesla's system will be. But if it is the only thing available, then everyone will flock to it.
Waymo shared a little PR video on Perception. Pretty simple but here is the transcript: "Today, we're breaking down perception. You may be familiar with it but to the Waymo Driver it means a complete awareness of the elements of environment through our custom built software and sensor suite. We generate a contextual view of the road by using machine learning to decipher complex data gathered by our advanced sensor suite in our development and testing. This knowledge prepares our fully autonomous driving technology to handle new and unexpected things, giving us a cohesive understanding of the people and the environment around us - like which lanes are closed for construction, how fast other vehicles are moving, and the color of traffic lights. We're making sense of the complex world around us. We're redefining mobility." https://twitter.com/Waymo/status/1357433806492467209 It seems like this might be the first in several videos on the Waymo Driver.
Oh great, now Waymo is in the dictionary business... I bet Merriam-Webster is shaking in their boots. Deliver a useful product! Stop with the videos and the definitions - let the product speak for itself.
Waymo has delivered a useful product! And the product does speak for itself: 20M autonomous miles in the US, only 47 incidents in 6M miles, and driverless rides available to the public in Phoenix.
If they had delivered a useful product, they would not be compelled to create random definitions. A useful product would obviate all need for talking about random words; it would show the end users what it can do. They sound like they are trying to find a buyer for their "product," and they need to reassure that buyer with impressive-sounding re-definitions of words.
Stop! You are sounding very ignorant. Perception is not a random word. It is a real term in autonomous driving. Waymo is not making up anything. Waymo is quoting real definitions. And Waymo is showing the end user what their product can do every day: Anybody in Phoenix can pick up their phone and summon and ride in a fully autonomous car 24/7.
Not anyone in Phoenix - a person in a 50 sq mile area of Phoenix/Chandler. You have to be IN that zone to request/use the service. LOL - no! Perception is a real word; there is no need to redefine it for autonomous driving. The normal definition is enough. Instead of wasting time on $#!t like this, they could be trying to - you know - expand the service to all of the Phoenix area!
Waymo is not redefining Perception. They are providing the correct FSD definition of the term. And it is important to educate the public on what real autonomous driving is and how it works. And they are working very hard to expand the service area. But they can do multiple things at the same time. They can work to expand the service area and also educate the public on what real autonomous driving is.
Sharing for informational purposes. The 2020 CA DMV disengagement report came out: Disengagement Reports - California DMV. Here is a summary for Waymo:

- Total annual autonomous miles: 628,838
- Total annual disengagements: 21
- Disengagement rate: 1 per 29,945 miles

Breakdown of disengagements:

- Perception discrepancy (a component of the vehicle's perception system failed to detect an object correctly): 1 per 78,605 miles
- Recklessly behaving road user: 1 per 628,838 miles
- Adverse weather conditions experienced during testing: 1 per 209,613 miles
- Incorrect behavior prediction of other traffic participants: 1 per 628,838 miles
- Unwanted maneuver of the vehicle that was undesirable under the circumstances: 1 per 78,605 miles

Waymo put out a thread on Twitter downplaying the report as not representing the full picture of their FSD capabilities. They link to their 6M miles safety report for more info. https://twitter.com/Waymo/status/135921435050292019

I guess one reason why Waymo does not like the report is that it does not tell you anything about safety. 1 disengagement per 29,945 miles says nothing about the safety of the autonomous car, since disengagements don't necessarily lead to accidents. The safety report shows that the Waymo Driver only gets into about 1 accident per 130,000 miles, and all of the accidents were pretty minor. So the safety is actually better than what the disengagement rate might suggest.
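For anyone who wants to sanity-check the per-category rates above, here is a minimal sketch. The raw per-category counts are inferred by dividing total miles by each reported rate (they are not stated in the DMV summary itself), so treat them as an illustration:

```python
# Sanity-check of the per-category disengagement rates in the DMV summary.
# Counts are inferred from the reported rates (miles / rate) - an assumption,
# not official DMV figures.

TOTAL_MILES = 628_838

# Inferred disengagement counts per category
counts = {
    "perception discrepancy": 8,
    "recklessly behaving road user": 1,
    "adverse weather": 3,
    "incorrect behavior prediction": 1,
    "unwanted maneuver": 8,
}

total = sum(counts.values())
print(f"Total disengagements: {total}")                    # 21
print(f"Overall: 1 per {TOTAL_MILES / total:,.0f} miles")  # 1 per 29,945 miles

for cause, n in counts.items():
    print(f"{cause}: 1 per {TOTAL_MILES / n:,.0f} miles")
```

The inferred counts sum to exactly 21 and reproduce every quoted rate, which is a good sign the summary is internally consistent.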
Is this the Waymo megathread? Has anyone already discussed the paper Waymo released on their safety record? https://storage.googleapis.com/sdc-prod/v1/safety-report/Waymo-Public-Road-Safety-Performance-Data.pdf I don't have the statistical chops to evaluate the paper. Never took a statistics class. I'm wondering if anyone with statistics credentials thinks this is convincing evidence that Waymo robotaxis are safer than the average human driver.
Yes, this is the Waymo megathread. And yes, this has already been discussed at length. The 6M miles of data are only from the Phoenix geofenced area, so it does not prove that Waymo robotaxis are necessarily safer than humans everywhere. But the data does provide good evidence that the Waymo robotaxis are safer than human drivers in that Phoenix geofenced area. This article presents some reasons, based on the Waymo data, why the Waymo robotaxis are "significantly superior" to humans in the Phoenix geofence where they drive:

"The report is notable for several reasons:

- There is incredible transparency, of a sort we have seen from no other team. Indeed, a gauntlet is now thrown in front of all other teams — if you're not this transparent, we will presume you are not doing so well.
- They have the ability to be that transparent because the numbers are good. In 6.1 million miles they report 30 "dings" with no injury expected, 9 with 10% chance of injury and 8 with airbag deployment but still 10% chance of injury, suggesting less than 2 modest injuries. Human drivers would have had about 6.
- All the events had the other driver/road user at fault in some way under the vehicle code, according to Waymo.
- There were no single-vehicle incidents (i.e. driving off the road), which are pretty common with human drivers.
- Nationally, 6.1 million miles of driving by a good driver should result in about 40-60 events, most of which are small dings, 22-27 of which would involve an insurance claim, 12 which would get reported to police and 6 injury crashes.

With no at-fault events in 8 lifetimes of human driving, Waymo's performance is significantly superior to a human, even in an easy place like Chandler."

Waymo Data Shows Superhuman Safety Record. They Should Deploy Today
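The arithmetic behind that quote is easy to reproduce. A minimal sketch, using only the numbers quoted in the article (the human-driver baselines are the article's figures, not independent data):

```python
# Comparison of the quoted numbers: Waymo's reported record over 6.1M miles
# vs. the article's human-driver expectations for the same mileage.
# Baselines are assumptions taken from the quote, not independent data.

MILES = 6_100_000

# Human-driver expectations quoted in the article, per 6.1M miles
human_events_range = (40, 60)   # any contact event
human_injury_crashes = 6

# Waymo's reported record over the same mileage
waymo_events = 30 + 9 + 8                   # dings + 10%-injury + airbag events
waymo_expected_injuries = (9 + 8) * 0.10    # each has ~10% injury chance -> ~1.7

print(f"Waymo events: {waymo_events}, vs human {human_events_range}")
print(f"Waymo expected injuries: {waymo_expected_injuries:.1f}, "
      f"vs human ~{human_injury_crashes}")
```

Note the total of 47 events matches the "47 incidents in 6M miles" figure cited earlier in this thread, and the expected-injury estimate of ~1.7 is where the quote's "less than 2 modest injuries" comes from.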
"We’ve optimized the Waymo Driver’s 360° vision system and lidar to navigate the complexities of urban driving. Our highly sensitive cameras can spot traffic lights changing at a long distance – even among the papel picado on 24th Street – to enable smooth driving. And our cameras and lidar can instantly spot a jaywalker sprinting across our path and act appropriately – even when they emerge suddenly from behind a vehicle in the oncoming lane." "We’ve also designed our software to reason about the context, which is essential for driving safely in busy cities. Our perception system lets our Driver know how to handle a pedestrian, a tree – and a pedestrian carrying a Christmas tree. If we pull up next to a bus by a crosswalk on Beach Street in Fisherman’s Wharf, our Driver can reason that hidden passengers may be getting off, and that they may soon cross the street." "We’re also building greater flexibility into our driving software to handle unexpected changes to the road. If we’re driving on 19th Avenue during road work and our sensors spot traffic cones and road work signs, our perception system understands that they are guiding us out of the usual lane, and our planning and routing systems can automatically update the vehicle’s route to navigate the new layout." Waypoint - The official Waymo blog: Expanding our testing in San Francisco
@powertoold I don't want to derail the other thread with a Waymo discussion, so I am posting this here. Here is the latest video from JJRicks.

Yes, the route might be simpler than some FSD Beta videos, but the Waymo robotaxi handles it smoothly and confidently with no driver. And the route is not simple: it still involves various driving problems, including unprotected turns, construction zones, school zones, lane changes, traffic, pedestrians, and parking lots. And the important thing to remember is that this is not a one-time "demo". Waymo robotaxis are doing this route reliably and safely thousands of times, with no driver at all. That's the key. It might be a "simpler" route, but Waymo's FSD can handle it with no driver.

On a side note, I find it impressive how many objects Waymo is able to display accurately on the screen. It is able to display a lot of objects and show their paths very accurately.

I appreciate what Tesla is doing. It's just a different approach. It's like quantity versus quality. Tesla is starting with quantity; Waymo is starting with quality. Tesla is starting with FSD that can be used everywhere but is not reliable and requires a driver, and working to make it better. Waymo is starting with FSD that is geofenced but works safely and reliably with no driver, and then expanding the area.
The unprotected lefts in this video are odd, if not unsafe: 1) 18:38 - very aggressively turns left almost in front of the white pickup 2) 26:42 - waits in the oncoming traffic lanes to finish an unprotected left
I think it looked good and safe in both cases. It drove like an aggressive but confident human driver; it was not hesitant. In 1), it knew the white pickup would pass safely by the time it made the turn, and it passed behind the white pickup. In 2), it edged forward, waited in the middle until it was safe, and then finished the turn when it was clear.
Looks like it's less than 5 feet away from the truck at some point, and the truck was probably going 35+ mph.
I think it was an unsafe maneuver because the oncoming traffic drivers were facing direct sunlight, one of them could have changed lanes to the left and the Waymo would be very close, if not in the way.
Yes, but you have to look at positions and velocities. Maybe it cut it a bit close, but the Waymo knew there was no risk of collision, since the truck was moving at 35 mph perpendicular to it and would be out of the Waymo's path by the time it made the turn.