Welcome to Tesla Motors Club

Waymo

Looks like Waymo is working with Geely to go the Cruise route?

Looks like Waymo is working with Geely to go the Cruise route?


I think it makes sense. Existing vehicles like the I-Pace or the Pacifica were designed to be consumer cars with a human driver. So while they can be great for testing autonomous driving and can certainly be used as robotaxis, they might not be ideal for that purpose. Autonomous ride-hailing is very different from regular personal driving, so it makes sense to design a vehicle from the ground up that better meets its needs. You can design the vehicle to optimize sensor coverage. You can redesign the interior to be more rider-centric, with higher capacity, to maximize revenue. You can also remove features a robotaxi does not need and design the vehicle to be more easily mass-produced, which cuts costs. Increasing rider capacity and reducing cost are very important for maximizing a robotaxi service's profit. Ultimately, I think a custom-designed robotaxi will be essential to a truly successful and profitable autonomous ride-hailing service long term, because you need a platform optimized for scalable, profitable, autonomous ride-hailing.
 
Wouldn't that just be an autonomous bus? Go to the bus stop and it will pick you up. I would prefer door-to-door service without everyone in the robotaxi knowing where I live.

My guess is that Waymo will probably offer different vehicles for their ride-hailing to meet different needs. This new Zeekr robotaxi will expand the vehicle choices. So if you want a ride just for you, you could summon an I-Pace. If you are a family, you could summon a Pacifica. But the Zeekr robotaxi is designed for high accessibility so if you are a person with low mobility, you could summon the Zeekr robotaxi.
 
A new video that highlights a couple of Waymo Driver's capabilities.


The fact that the Waymo Driver can recognize that a cone next to a truck means the truck is double-parked and it should go around is really good, IMO. That is the kind of scene understanding that humans have, and that AVs also need in order to be smarter and not get stuck as much.

Also, the reference at the end to ML that learns from human preferences would seem to imply that Waymo is doing imitation learning, and the last bit suggests Waymo is applying it to city driving.
 
Interesting lawsuit: Waymo sues California DMV to keep robotaxi safety details secret

Especially the particular details they want to keep private:
The topics Waymo wants to keep hidden include how it plans to handle driverless car emergencies, what it would do if a robot taxi started driving itself where it wasn’t supposed to go, and what constraints there are on the car’s ability to traverse San Francisco’s tunnels, tight curves and steep hills. Waymo also wants to keep secret descriptions of crashes involving its driverless cars.

I can see how emergency plans could be proprietary (or dangerous to make public), but I don't think they can make the same argument for crash descriptions.
 
Interesting lawsuit: Waymo sues California DMV to keep robotaxi safety details secret

Especially the particular details they want to keep private:


I can see how emergency plans could be proprietary (or dangerous to make public), but I don't think they can make the same argument for crash descriptions.
Hard to know if there are trade secrets there since it's redacted. haha
Here are the public reports of those two incidents shown in the article.
A Waymo Autonomous Vehicle (“Waymo AV”) traveling northbound on Church Street at 21st Street in San Francisco was involved in a collision. While stopped in autonomous mode, the Waymo AV switched to reverse gear in preparation for making a multipoint turn, just as the test driver disengaged by applying the accelerator pedal. The Waymo AV made contact with a passenger vehicle behind it at approximately 3 MPH. The Waymo AV sustained minor damage to its rear bumper, and the passenger vehicle sustained minor damage to its front bumper. No injuries were reported at the scene.
A Waymo Autonomous Vehicle (“Waymo AV”) traveling westbound on Filbert Street at Scott Street in San Francisco was involved in a collision. After coming to a complete stop in autonomous mode, the Waymo AV switched to reverse gear in preparation for making a multipoint turn. Within fractions of a second, the test driver applied the accelerator pedal to disengage autonomous mode. The Waymo AV made contact with a passenger vehicle that was stopped behind it at approximately 3 MPH. The Waymo AV sustained minor damage to its rear bumper, and the passenger vehicle sustained minor damage to its front bumper. No injuries were reported at the scene.
Sounds pretty embarrassing to have two nearly identical failures!
 
Sounds pretty embarrassing to have two nearly identical failures!
Oh I see what happened. The car autonomously shifted into reverse, the test driver didn't notice, hit the accelerator thinking they were still in drive, then crashed into the car behind them.
This is a pretty bad safety flaw in their testing. I wonder how they fixed it... and of course why they didn't fix it after the first collision.
 
Oh I see what happened. The car autonomously shifted into reverse, the test driver didn't notice, hit the accelerator thinking they were still in drive, then crashed into the car behind them.

How would the Waymo hit the car behind it by hitting the accelerator? The Waymo did autonomously shift into reverse to do a multipoint turn. But I think the safety driver hit the accelerator to try to avoid getting rear-ended, too late to avoid the collision. Note that the Waymo suffered damage to its rear bumper and the other car suffered damage to its front bumper. So the car hit the Waymo from behind. I suspect the driver behind the Waymo might have been surprised by the Waymo shifting into reverse, or maybe was not paying attention, which led to rear-ending the Waymo.

I do wonder how many of these accidents happened. If those 2 accidents are the only ones of that type in the 2.7M miles Waymo did last year, then they are very rare. Having said that, I trust Waymo has analyzed the accidents by now (we know they replay disengagements in their simulations) and has improved their software to reduce these types of collisions.
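To put "very rare" in numbers, here is a back-of-envelope rate calculation under the post's own assumption that these two reverse-shift collisions were the only incidents of that type across the ~2.7M miles reported for the year:

```python
# Assumes exactly 2 incidents over 2.7 million miles, per the post above.
MILES_DRIVEN = 2_700_000
INCIDENTS = 2

rate_per_million_miles = INCIDENTS / (MILES_DRIVEN / 1_000_000)
miles_per_incident = MILES_DRIVEN / INCIDENTS

print(f"{rate_per_million_miles:.2f} incidents per million miles")  # 0.74
print(f"one incident every {miles_per_incident:,.0f} miles")        # 1,350,000
```

Of course, as noted below, the two incidents being only 11 days apart suggests the per-mile average may not tell the whole story.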
 
...They need a Teslabot that can get out the car and yell at the person blocking them.
Optimus Full Driver Assistance Capability $15,000
  • Gas-pump and Supercharger Hook-Up with snack fetching
  • Close quarters hand-signaling assistance
  • Tesla v11 UI menu-diving
  • Blank Staring at median panhandlers
  • Diverse culture-adaptive Driver Insult Exchange with Enhanced Hand Gestures (New York ODD requires High-Capacity Database Option)
Coming Soon
  • Robo Road Rage on City Streets
 
How would the Waymo hit the car behind it by hitting the accelerator? The Waymo did autonomously shift into reverse to do a multipoint turn. But I think the safety driver hit the accelerator to try to avoid getting rear ended but too late to avoid the collision. Note that the Waymo suffered damage to its rear fender and the other car suffered damage to its front bumper. So the car hit the Waymo from behind. I suspect the driver behind the Waymo might have been surprised by the Waymo shifting into reverse or maybe was not paying attention, which led to rear ending the Waymo.

I do wonder how many of these accidents happened? If those 2 accidents are the only ones of that type in the 2.7M miles Waymo did last year, then they are very rare. Having said that, I trust Waymo has analyzed the accidents by now (we know they replay disengagements in their simulations) and have improved their software to reduce these types of collisions.
No, I suspect the safety driver thought the car was stuck (maybe it was) or impeding traffic. However, a split second before they disengaged by pressing the accelerator, the car shifted into reverse, so they went backward instead of forward.
Really, it seems like the accelerator should always mean forward unless the safety driver themselves engages reverse.
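That rule is simple enough to sketch. This is purely illustrative, under the assumption described in these reports (autonomy had shifted to reverse, the driver pressed the accelerator to disengage without choosing a gear); the names here are made up, not Waymo's actual software:

```python
# Hypothetical sketch of the interlock suggested above: an
# accelerator-pedal disengagement defaults the car to forward gear
# unless the safety driver explicitly selected reverse themselves.
from enum import Enum
from typing import Optional


class Gear(Enum):
    DRIVE = "drive"
    REVERSE = "reverse"


def gear_after_disengage(driver_selected: Optional[Gear],
                         autonomy_gear: Gear) -> Gear:
    """Gear to use once the driver takes over via the accelerator.

    The gear autonomy had selected (e.g. REVERSE in preparation for a
    multipoint turn) is deliberately ignored: the human's accelerator
    press only moves the car backward if the human chose reverse.
    """
    return driver_selected if driver_selected is not None else Gear.DRIVE


# The Church St / Filbert St scenario: autonomy had shifted to reverse,
# the driver pressed the accelerator without selecting a gear.
assert gear_after_disengage(None, Gear.REVERSE) is Gear.DRIVE
# A driver who explicitly shifts to reverse still gets reverse.
assert gear_after_disengage(Gear.REVERSE, Gear.DRIVE) is Gear.REVERSE
```

The point of the design is just that autonomy's gear selection should never survive a takeover the driver didn't initiate in that gear.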
 
How would the Waymo hit the car behind it by hitting the accelerator?
Do you understand how reverse gear works?

But I think the safety driver hit the accelerator to try to avoid getting rear ended but too late to avoid the collision.
Vehicle 2 was "Stopped in Traffic" in both cases. Waymo was "Backing". No need for wild theories, the accident reports are quite clear.

If those 2 accidents are the only ones of that type in the 2.7M miles Waymo did last year, then they are very rare.
11 days apart does not indicate rarity. I'm sure they took corrective action in the 10 months since, though.

Why did the cars attempt 3 point turns at intersections on public streets? The Church St one looks particularly dicey. Maybe a road closed situation, or a wreck with a cop sending traffic back the way it came? I wonder if this led to that later news story about tons of Waymos going down that dead end street and turning around? We'll probably never know. Waymo is quite transparent when it meets their marketing goals, otherwise their lips are sealed.
 
Why did the cars attempt 3 point turns at intersections on public streets?
Vehicle 2 was "Stopped in Traffic" in both cases. Waymo was "Backing".
Yes - kind of weird.

No wonder they are suing.

Isn't it interesting that on a Tesla-dedicated website some people are always demanding more transparency from Tesla and less from others? I wonder what the motivations are.
 