Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Waymo

Here is what the app looks like when applying to the Trusted Tester program in SF:

[screenshot: Waymo app Trusted Tester application screen]
 
As a comparison, has anyone ever posted the NDA terms for FSD Beta? It would be interesting to know what they are forbidden from showing.

Have we ever had someone break the NDA to talk about it in detail? Some quick searching doesn't turn up much more than speculation, plus a confirmation that there is an FSD Beta NDA. The only detail I can find on forbidden disclosure so far is live streaming:

 
This corresponds with what I had read earlier. My take is that this is not very restrictive at all, and the Tesla FSD beta is remarkably open, with various testers deciding where & when to drive, immediate and uncensored video upload after the drive, and various creative testing & video production methods.

The restriction about not live-streaming from the car during the drive is completely reasonable IMO. I think there are multiple reasons, but certainly one of them is that if there were to be any accident, the video becomes investigation & litigation evidence - and as we see time and again, not always a complete & reliable picture of what actually transpired before, during and after an incident. I've never seen the NDA but it would make sense that testers are instructed not to upload anything if an accident does occur, and possibly even if said accident did not involve the Tesla itself.

I'm not about to criticize Waymo for their NDA terms, but I think no one can reasonably accuse Tesla of being secretive in comparison.
 
If Tesla is not being secretive, could they post the NDA terms? Or is posting the full NDA terms a violation of the NDA terms? (aka: The First Rule of Fight Club).

We have seen several videos of people actively testing whether FSD would run over their spouse/kids/friends, but no known failures have been shown: no collisions with cars or large objects, and no injuries to living beings. Several curb strikes, and some small objects run over during tests.

I'm glad to see they are as lenient as they are in allowing videos of test scenarios. Somehow the tests I want to see never get aired: carrying a test through past the "so then I had to brake to avoid the collision" phase (obviously only with inflatable objects or paper models). I can't believe there are no collisions with all the near-misses, so they must be failing to report them publicly. Possibly you are not allowed to show objects hitting your car (even inflatables or boxes)?
 
Didn't DirtyTesla have a video where he tested against an inflatable deer? I think it was an early FSD Beta, but maybe not.

I think the SVOC had videos in the past of trying to run down boxes... (I don't recall if they hit any or if it always stopped or went around them.)

So the tests have been done, on older versions...
 
Yes, Dirty Tesla did knock over a toy dog. I'm looking for something more drastic: faster speeds and bigger objects. But yes, you are correct.
 
Looks like we got a clip from an early rider in SF. Apparently the NDA does not preclude social media posts:

It's hard to read, but the wheel appears to say:
Please keep your hands off the wheel
The Waymo Driver is in control at all times


Who is the Waymo Driver, the car or the person? Are those contradictory statements if it's the person?

 

Yes, that is what it says. "Waymo Driver" is Waymo's name for their FSD system. It is referring to the car. So it is saying "please keep your hands off the steering wheel, the FSD system in the car is in control".
 
They added this language to the Pacificas when they went driverless. It's partly a dig at Tesla, but it also tells passengers not to touch the controls because their "Waymo Driver" software handles everything. Except during ConeGate, ha. Brad Templeton argues Waymo support should have been allowed to put the car in "customer control" mode and let JJ Ricks drive it at low speed to a safe place. Or they could at least let him use the touchscreen to select a path. This brings up issues, e.g. liability, what if the rider is unlicensed, or drunk, etc. But it would also solve a lot of problems.

Not sure why they have this message on the i-Paces. They can't tell the safety driver to keep his hands off the wheel -- putting them on the wheel when needed is his main job.
 
Brad Templeton argues Waymo support should have been allowed to put the car in "customer control" mode and let JJ Ricks drive it at low speed to a safe place. Or they could at least let him use the touchscreen to select a path. This brings up issues, e.g. liability, what if the rider is unlicensed, or drunk, etc. But it would also solve a lot of problems.

Waymo will never do that because of the liability issues. Plus, it completely goes against Waymo's core vision. Waymo says that their entire reason for pursuing L4 in the first place is that the human driver can't be trusted and that their goal is autonomous driving that replaces the human driver.

Not sure why they have this message on the i-Paces. They can't tell the safety driver to keep his hands off the wheel -- putting them on the wheel when needed is his main job.

Probably because their plan is to remove the safety driver at some point.
 
And there's the rub. Human drivers can't be trusted, but Waymo cannot possibly trust a prediction of what a human in an oncoming vehicle, threatening a collision, will do. Humans tend to intuit these situations better than autonomous vehicles.

Your contention suggests that until all vehicles are autonomous, autonomous systems cannot "trust" any prediction of what imperfect humans will do. This is where the human back-up in FSD beats the crap out of an autonomous vehicle that can't get out of a line of traffic cones.
 

That's not what Waymo is talking about. They are not talking about trusting humans in other cars. In context, Waymo is talking about not trusting humans to supervise a L2 "FSD" system because humans will get complacent. Waymo does not believe you can safely do L2 "FSD". So Waymo argues that the best way to do FSD is to develop a system that does not need a human at all. Hence why Waymo is focused on solving L4.
 
The SF Chronicle has a review of the Waymo rides in SF. It is behind a paywall.

But here are some highlights:

Wednesday’s jaunt in Waymo’s self-driving Jaguar I-Pace had no surprises. The car smoothly glided to a stop at intersections and signaled before turns. The closest thing to a hiccup was a slight jerkiness when it pulled over to drop off us faux ride-hailing passengers. The company confined our 20-minute test drive to the wide, relatively empty streets of the Sunset district, avoiding the hills and traffic elsewhere in the city, so the car didn’t face any challenges. Ever Guardado, our safety driver, never needed to assume control, but did so once at my request just to demonstrate the tones the car sounds when a driver takes over.
Four years ago when I rode in an earlier-generation Waymo robot car at its testing site on a former military base in Merced County, Waymo staged several complex situations: a bicyclist pedaling alongside, a pedestrian wandering into the crosswalk, a car abruptly cutting off the robot car and boxes sprawling into the road. That robot car, which did not have a person behind the wheel, adapted quickly to all the tests thrown at it.
For now San Francisco rides are free — Waymo doesn’t yet have permission to charge a fee. And although it’s had permission since 2018 to ditch backup drivers in California, it has not yet done so. Rival Cruise has been testing some cars without backup drivers in San Francisco for nine months.

Why test robot taxis in San Francisco? “It’s a major ride-hailing market with a lot of demand from residents,” said spokeswoman Julianne McGoldrick. “And it’s one of the toughest environments for driving — weather, traffic, road grade, cyclists.”
But McGoldrick said its trajectory here to fully autonomous paid rides will be quicker than that in Phoenix.
“The low-hanging fruit in this field is some combination of slow speeds, simple environments and semi-supervised operations,” said Bryant Walker Smith, a long-time industry observer and affiliate scholar at the Stanford Law School Center for Internet and Society, in an email. “Waymo and Cruise have long been more interested in moon shots.”
Those companies need to find “sweet spots that are both technically and financially feasible,” he said, noting that while Phoenix is technically much easier than the Bay Area, it’s also much less lucrative.
Current ride-hailing has low expenses, he said: “an Uber trip is basically just a poorly paid driver in a midtier car.”

But the robot car fleets, even though they eliminate labor expenses, have to cover costs for cars that run up to several hundred thousand dollars each, “as well as a vast infrastructure for remote human monitoring and support.”



 

How long did they take in Phoenix ?
 
About 3.5 years.

But in Phoenix, they had no experience with ride-hailing and were just starting their 4th-gen FSD. And COVID delayed things a bit too. I doubt it will take that long in SF. Waymo now has the benefit of all the experience in Phoenix and the 5th-gen FSD, which is much better than what they have in Phoenix. In the Autonomous Progress thread, I predicted "6-12 months" before they open it up wide to everybody with no NDA. I could be too optimistic. But I still think it will take much less than 3.5 years.
 

“We’re winding down our commercial lidar business as we maintain our focus on developing and deploying our Waymo Driver across our Waymo One (ride-hailing) and Waymo Via (delivery) units," a Waymo spokesperson said in a statement.