Waymo

The tech will get better and Waymo will get enough miles to prove safety. Then, they will deploy driverless rides in the rain.

And Tesla has yet to deploy any driverless rides on a perfect sunny day. ;)
Yes, but Teslas can automatically steer, turn, brake, accelerate with driver supervision - anywhere, sunny, cloudy, or rainy. ;)
 
Will that hurt Waymo's (hypothetical) profits when the weather is good? Will consumers be so annoyed when they're not available that they'll stop using them altogether?
Obviously Waymo plans to get to their desired level of safety when operating in the rain. It's also possible they just haven't driven enough miles in the rain yet to estimate how safe the system is.
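On the "enough miles to estimate" point, there is a standard back-of-the-envelope for this: if a system logs N miles with zero safety events, the classical "rule of three" puts the 95% upper confidence bound on the per-mile event rate at roughly 3/N. A minimal sketch (the mileage figure is invented, purely for illustration, not a Waymo number):

```python
# Rule of three: with zero events observed over n independent trials,
# the 95% upper confidence bound on the event rate is about 3 / n.
# Here a "trial" is one mile driven in rain with no safety event.

def rate_upper_bound(event_free_miles: float) -> float:
    """95% upper bound on events per mile after zero observed events."""
    return 3.0 / event_free_miles

rain_miles = 100_000  # hypothetical rainy miles, purely illustrative
bound = rate_upper_bound(rain_miles)
print(f"95% upper bound: {bound:.1e} events/mile "
      f"(i.e. at worst one event per {1 / bound:,.0f} miles)")
```

The point being that a claim like "safe in rain" needs a lot of rainy miles before the statistics say anything at all.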
So to train, in the lane, needs no human participation,

But to train in the rain, needs divine precipitation!

(Though we train, in the main, with computer simulation)

Thus we explain, very plain, our current situation.

You may obtain, at our domain, Way-mo great information!
 
Yes, but Teslas can automatically steer, turn, brake, accelerate with driver supervision - anywhere. ;)

automatically steer, turn, brake, accelerate with driver supervision - anywhere < geofenced driverless rides.

Besides, the goal is to remove driver supervision. You don't get any points for automatically steering, turning, etc. with driver supervision.
 
automatically steer, turn, brake, accelerate with driver supervision - anywhere < geofenced driverless rides.

Besides, the goal is to remove driver supervision. You don't get any points for automatically steering, turning, etc. with driver supervision.
I think it's clear that you can have very very limited, geofenced fully automated vehicles. I prefer a vehicle where I can use the semi-autonomous driving features that will actually exist once the beta gets into the wild.

Tesla's goal in the here and now is to minimize the situations that require driver supervision. Full automation is Tesla's aspiration.
 
...You don't get any points for automatically steering, turning, etc. with driver supervision.
On this very specific point, if there are no interventions then it doesn't matter whether there is a supervising or backup driver present. You get the points towards proof of unsupervised operation.

If you drive 100k miles, ready to intervene but never needing to, then those 100k miles can properly be counted as a demonstration of successful autonomy. However (of course), on a given software release you can't cherry-pick the miles from successful no-intervention drives and then throw out the unsuccessful ones by claiming "it was just a supervised test run".
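To put a toy number on why the cherry-picking matters, here is a small simulation (every figure is invented, and it counts at most one intervention per drive for simplicity):

```python
import random

random.seed(0)

# Invented numbers: the software truly needs one intervention per
# 500 miles, and each test drive covers 50 miles.
TRUE_RATE = 1 / 500
DRIVE_MILES = 50
N_DRIVES = 2_000

# A drive is "clean" if no mile of it required an intervention.
clean = sum(random.random() < (1 - TRUE_RATE) ** DRIVE_MILES
            for _ in range(N_DRIVES))
failed = N_DRIVES - clean

# Honest accounting: every mile driven on this release counts.
print(f"honest: one intervention per "
      f"{N_DRIVES * DRIVE_MILES / failed:,.0f} miles")

# Cherry-picked: failed drives waved off as "just supervised test runs".
print(f"cherry-picked: {clean * DRIVE_MILES:,} clean miles, "
      f"zero interventions reported")
```

The honest figure converges on the true rate; the cherry-picked one can be made to look arbitrarily good.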

It might make better PR copy to brag about miles with no one in the driver's seat, but to regulators it should make no difference whether there was a backup driver for some or all miles being presented.
 
On this very specific point, if there are no interventions then it doesn't matter whether there is a supervising or backup driver present. You get the points towards proof of unsupervised operation.

If you drive 100k miles, ready to intervene but never needing to, then those 100k miles can properly be counted as a demonstration of successful autonomy. However (of course), on a given software release you can't cherry-pick the miles from successful no-intervention drives and then throw out the unsuccessful ones by claiming "it was just a supervised test run".

It might make better PR copy to brag about miles with no one in the driver's seat, but to regulators it should make no difference whether there was a backup driver for some or all miles being presented.

Unfortunately, Tesla has not released any disengagement data. So we don't know if their supervised driving is proof of successful autonomy, or how close Tesla is to achieving it. Certainly, I doubt FSD beta is anywhere near a rate of one disengagement per 100k miles.

Waymo does release their annual disengagement data for supervised autonomy (with safety driver) to show evidence of their autonomous driving.
 
Unfortunately, Tesla has not released any disengagement data. So we don't know if their supervised driving is proof of successful autonomy, or how close Tesla is to achieving it. Certainly, I doubt FSD beta is anywhere near a rate of one disengagement per 100k miles.

Waymo does release their annual disengagement data for supervised autonomy (with safety driver) to show evidence of their autonomous driving.
I wasn't specifically referring to Tesla or to anybody else, but generally commenting about whether miles can count if there is a backup driver.

But since you brought it up, I wouldn't expect Tesla to release such data until they wish to claim an unsupervised package. If I were in their shoes I'd be doing exactly what they are doing (regarding data and MVD reporting, I mean - not regarding the way they price, promote and license "FSD Capability"): continue to gather in-house and customer data in L2 mode, data-mine for targeted situations, run new code in shadow mode, etc. There's no purpose in looping in the regulators until they have a successful release candidate. Waymo has deployed L4 and presumably must disclose in the state(s) where they wish to operate.
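For anyone unfamiliar with the term, "shadow mode" generally means running candidate software alongside the shipped stack and logging where the two disagree, while only the shipped stack ever controls the car. A minimal sketch of the idea; all names and thresholds here are my own invention, not Tesla's actual code:

```python
from dataclasses import dataclass

@dataclass
class Command:
    steer: float  # steering angle in radians
    accel: float  # m/s^2; negative means braking

def shadow_step(frame, active_model, candidate_model, disagreements,
                steer_tol=0.05, accel_tol=0.5):
    """Drive on the active model; silently log where the candidate differs."""
    live = active_model(frame)        # this command actually drives the car
    shadow = candidate_model(frame)   # this one is only evaluated

    if (abs(live.steer - shadow.steer) > steer_tol
            or abs(live.accel - shadow.accel) > accel_tol):
        # Disagreement frames get uploaded for triage / targeted training.
        disagreements.append((frame, live, shadow))

    return live  # only the active model's output reaches the actuators
```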

It's not like Tesla is being unduly secretive, though that seems to be a common implication here. The whole YouTube FSD beta paradigm is a surprisingly open look into their unfinished product, for better or worse. They are giving everyone, including competitors and some extremely ardent & tireless critics, every chance to see the good (often flying by at 5x or so) and plenty of the bad with detailed discussion, retries and frustrations. I'm not sure I've seen anything quite like it since the live coverage of the early US space program, and that was no private enterprise.
 
I think it's clear that you can have very very limited, geofenced fully automated vehicles. I prefer a vehicle where I can use the semi-autonomous driving features that will actually exist once the beta gets into the wild.

Tesla's goal in the here and now is to minimize the situations that require driver supervision. Full automation is Tesla's aspiration.

Since you are talking specifically about Tesla: the issue is that Tesla hasn't been able to do even your so-called "very very limited, geofenced fully automated" driving, and it isn't for lack of trying. Heck, they can't even do a one-off cross-country drive, let alone that. Just like in the past, people excuse Tesla for failing to meet its goals and change the narrative depending on what it can do. You will do mental gymnastics if, in 3 years, Tesla manages to get L4 working in a suburban city in sunny weather.

What people are trying to explain to you, and what you refuse to see, is that geofenced L4, even in just a suburban city and in good weather, is still harder than L2 across an entire country. And Tesla isn't the only one doing L2 across an entire country (Huawei in Q4 2021, then Nio and Xpeng in 2022); heck, Mobileye is doing L2 spanning the entire globe.

L2 everywhere is easier than L4 somewhere.
 
I don't think any autonomous vehicle would be expected to handle flash floods.

Having grown up in Dallas, I know "flash floods" as the multiple-times-per-year occurrence where a lot of water in a short time creates deep water, usually under a highway overpass, that people drive into and then drown in when their car gets carried away in the current. It happens repeatedly because people think it won't happen to them, and because the river of water really doesn't look very deep (and that's to a human, not a neural net).

Autonomous vehicles absolutely should "handle" flash floods. Maybe they simply decide "I'm designed for Arizona and we'll just power down until it stops raining", and maybe the flash flood is infrequent enough that this "handling" is done by a human operator at headquarters manually broadcasting that shutdown. Sure, they should eventually have enough smarts to evade it on their own. But to my mind, not "handling" flash floods means that you're oblivious to this and are willing to risk drowning your passenger by driving into what looks like a puddle.
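That last fallback really can be dead simple on the fleet side. A toy sketch of an operator-initiated stand-down; everything here (names, states, interface) is invented for illustration:

```python
from dataclasses import dataclass
from enum import Enum, auto

class Mode(Enum):
    SERVING = auto()
    PULL_OVER = auto()  # finish the current maneuver, stop somewhere safe

@dataclass
class Vehicle:
    vin: str
    mode: Mode = Mode.SERVING

def broadcast_flash_flood_standdown(fleet):
    """Remote operator takes the whole fleet off the road during an alert."""
    for v in fleet:
        v.mode = Mode.PULL_OVER  # no car keeps driving toward standing water

fleet = [Vehicle("demo-1"), Vehicle("demo-2")]
broadcast_flash_flood_standdown(fleet)
print([(v.vin, v.mode.name) for v in fleet])
```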
 
Having grown up in Dallas, I know "flash floods" as the multiple-times-per-year occurrence where a lot of water in a short time creates deep water, usually under a highway overpass, that people drive into and then drown in when their car gets carried away in the current. It happens repeatedly because people think it won't happen to them, and because the river of water really doesn't look very deep (and that's to a human, not a neural net).

Autonomous vehicles absolutely should "handle" flash floods. Maybe they simply decide "I'm designed for Arizona and we'll just power down until it stops raining", and maybe the flash flood is infrequent enough that this "handling" is done by a human operator at headquarters manually broadcasting that shutdown. Sure, they should eventually have enough smarts to evade it on their own. But to my mind, not "handling" flash floods means that you're oblivious to this and are willing to risk drowning your passenger by driving into what looks like a puddle.

You misunderstand; I was probably not very clear. By "not handling," I meant that AVs should not drive through a flash flood - in other words, we should not expect AVs to be able to drive in one, which is what the other poster was asking about Waymo robotaxis. AVs should absolutely be able to recognize when the road is not drivable, as in a flash flood, and safely avoid it or safely pull over. So yes, L4 or L5 AVs need to be smart enough to avoid a flash flood when driving through it is not safe.
 
And this, other than puffery and feel-good advertising, adds what to this discussion?

Not every post has to be on the level of "OMG! This will solve FSD!". It's a nice little example of how Waymo is using their real autonomous driving to benefit others in a meaningful way. This thread is dedicated to Waymo so I thought I would share something nice that they are doing.

I guess I forgot that you are the forum police for what is worthy to be posted. LOL.
 
Not every post has to be on the level of "OMG! This will solve FSD!". It's a nice little example of how Waymo is using their real autonomous driving to benefit others in a meaningful way. This thread is dedicated to Waymo so I thought I would share something nice that they are doing.

I guess I forgot that you are the forum police for what is worthy to be posted. LOL.
Yes, but posting marketing materials seems, um, redundant to your marvelously effective Waymo cheerleading.
 
Might be a while until we get some user feedback on how the SF deployment is going. Sharing photos or videos of the car/ride seems to be forbidden according to the Terms of Service.

We might get a couple of leaks, but we won't get meaningful user feedback until the NDA is lifted. It was the same with Chandler: Waymo had an early rider program under strict NDA at first. Later, they lifted the NDA and opened it up to everybody, and then we got lots of videos (thanks @JJRicks).

So the question is how long the Trusted Tester program will last and when Waymo will lift the NDA. In the other thread, I predicted "6-12 months". I do think it will be shorter than with Chandler, just because Waymo has more experience now. They are not starting from the same point as when they launched ride-hailing in Chandler.