Welcome to Tesla Motors Club

Waymo

Key sentence from the above:

“We supplement this with data collected from testing of our engineering fleet in non-autonomous mode, and from autonomous testing that is done in other settings, including on public roads in various other locations around the world.”
Why have you made this grand leap to assume that non-autonomous mode refers to their known public FSD testing, and that “autonomous testing” must therefore be something better that was previously undisclosed?

Non-autonomous data collection can be any level of engagement, including full driver control, and we already know Tesla claims “autonomous” on their currently scrutinized software.
 
There is no grand leap.
The context is that this is a reply to the California DMV regarding regulating autonomous driving.
Lastly, the focus of this discussion is NOT on the "non-autonomous mode" but rather on "autonomous testing that is done in other settings, including on public roads in various other locations around the world."
That quote is literally from 2017, so apparently Tesla has been hiding this more advanced autonomous testing around the world for years now. :rolleyes:
 
That quote is literally from 2017, so apparently Tesla has been hiding this more advanced autonomous testing around the world for years now. :rolleyes:

lol, now you know the lengths these people will go to in order to twist and discredit others and to prop up Tesla. Notice how none of this was being said before.
It was "wait for FSD." Then FSD Beta was released to the "PR spectacle" group, we saw that it sucks, and then it became "there must be a secret, better autonomous software out there," which means Tesla has as many L4 cars as Waymo.

The real truth is:

Waymo has 30+ autonomous cars with no drivers ferrying public paying passengers
Tesla has 0 autonomous cars with no drivers ferrying public paying passengers

Waymo has done over 100k miles of autonomous rides with no drivers while ferrying public paying passengers
Tesla has done 0 miles of autonomous rides with no drivers while ferrying public paying passengers.
 
I’ve watched the original video where he said it.
If you ask anyone around the industry what the biggest blocker is right now, they would tell you it's not perception, it's prediction and planning. So it's not just Amnon.

From Voyage's Oliver, Cruise, etc.

Even Huawei, which is about to release a non-geofenced, door-to-door L2 system that works everywhere in China, said:
“In terms of the overall technical complexity of the industry, the difficulty of perception at the beginning is something that everyone knows. But now in terms of theory and technological maturity, the two problems of prediction and control are the real problems in the industry. What a lot of people don’t realize… is that planning and control has a greater impact on MPI (Miles per intervention) than perception.”
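The MPI metric the quote references is simply total autonomous miles divided by the number of interventions. A minimal sketch (the function name and all figures are hypothetical, for illustration only):

```python
# Miles per intervention (MPI), as referenced in the quote above:
# total autonomous miles divided by the number of driver interventions.
# All numbers here are made up for illustration.

def miles_per_intervention(miles: float, interventions: int) -> float:
    """Return MPI; treat a zero-intervention log as infinite MPI."""
    return float("inf") if interventions == 0 else miles / interventions

# e.g. a fleet that logged 120,000 miles with 8 interventions
print(miles_per_intervention(120_000, 8))  # → 15000.0
```

A higher MPI means the system drives farther, on average, before a human has to step in, which is why planning and control failures drag the metric down even when perception is working.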

Tesla fans, on the other hand, will say it's WRONG. Why? Because they know that Tesla STILL struggles with perception, unlike the rest of the industry.
Tesla FSD struggles to know where the curbs are, to correctly estimate objects' distances and dimensions on the road, and sometimes fails to detect them at all and tries to run into them.

 
Kevin Vincent is the Associate General Counsel for Regulatory issues at Lucid.

30-some pages; I'll try to get to it today. But why do we need a proposal from a company that doesn't even have cars, let alone anything to show for autonomy?

O. Kevin Vincent just happens to be working at Lucid now. He has extensive experience with EVs and regulations from before working at Lucid. He is the author but Lucid has nothing to do with the report.

Here is O Kevin Vincent's bio:

"O. Kevin Vincent is the Associate General Counsel for Regulatory issues at Lucid. Prior to joining Lucid, he advised companies in the electric vehicle and automated vehicle industries on legal and regulatory issues. Mr. Vincent held in-house positions with two EV manufacturers, and more recently had his own practice advising multiple clients on EV and AV issues. Mr. Vincent's background includes serving as the Chief Counsel for the National Highway Traffic Safety Administration (NHTSA), Department of Transportation, in Washington, D.C. In that role, Mr. Vincent provided legal advice to the NHTSA Administrator and other DOT officials, including the General Counsel and Secretary of Transportation, on transportation safety and fuel economy issues. While at NHTSA, Mr. Vincent accelerated adoption by the automobile industry of "green" technologies, having managed the drafting of the Corporate Average Fuel Economy (CAFE) greenhouse gas reduction regulations jointly issued by NHTSA and the EPA, leading to adoption of new technologies to improve fuel efficiency. Mr. Vincent's efforts helped result in the historic CAFE/CHG standards for Model Years 2017-2025 light duty vehicles that doubled the fuel efficiency of our nation's vehicles."
 
lol, now you know the lengths these people will go to in order to twist and discredit others and to prop up Tesla. Notice how none of this was being said before.
It was "wait for FSD." Then FSD Beta was released to the "PR spectacle" group, we saw that it sucks, and then it became "there must be a secret, better autonomous software out there," which means Tesla has as many L4 cars as Waymo.

The real truth is:

Waymo has 30+ autonomous cars with no drivers ferrying public paying passengers
Tesla has 0 autonomous cars with no drivers ferrying public paying passengers

Waymo has done over 100k miles of autonomous rides with no drivers while ferrying public paying passengers
Tesla has done 0 miles of autonomous rides with no drivers while ferrying public paying passengers.


Waymo acknowledged that all "no human in driver seat" Waymo trips have remote operators/drivers that can take control of the car at any time.

So we do not know how many Waymo trips had interventions or how many of the 100k miles were really completely autonomous.

How complex were these Waymo 100K miles?
Is most of it limited access divided roads with few intersections?
How many times is it just repeated trips on the same easy roads it did before?
 
Waymo acknowledged that all "no human in driver seat" Waymo trips have remote operators/drivers that can take control of the car at any time.

Waymo has said that remote operators never control the car. They merely provide suggestions to the car. The car is still in full autonomous mode the entire time.

So we do not know how many Waymo trips had interventions or how many of the 100k miles were really completely autonomous.

All the Waymo trips were fully autonomous since the remote operators do not control the car. The Waymo is still in full autonomous mode even with a remote operator.
 
Waymo acknowledged that all "no human in driver seat" Waymo trips have remote operators/drivers that can take control of the car at any time.

So we do not know how many Waymo trips had interventions or how many of the 100k miles were really completely autonomous.

How complex were these Waymo 100K miles?
Is most of it limited access divided roads with few intersections?
How many times is it just repeated trips on the same easy roads it did before?


This has been repeated a lot, and as the video above proves, they can't joystick, steer, brake, or accelerate the car in any way. They CAN'T take control.
If the car were barreling into a row of children, they couldn't do anything.

All they can do is tell it to pull over, confirm a request/decision or draw a short path for it.

All of the 100k miles were autonomous. When a human has to drive, they don't count it.
No, they are city streets; Waymo is not fully autonomous on highways yet.
Here are 60+ videos you can watch:

 
I am just going by what Tesla told the CA DMV:

"For context, and as we’ve previously discussed, City Streets continues to firmly root the vehicle in SAE Level 2 capability and does not make it autonomous under the DMV’s definition. City Streets’ capabilities with respect to the object and event detection and response (OEDR) sub-task are limited, as there are circumstances and events to which the system is not capable of recognizing or responding. These include static objects and road debris, emergency vehicles, construction zones, large uncontrolled intersections with multiple incoming ways, occlusions, adverse weather, complicated or adversarial vehicles in the driving path, unmapped roads. As a result, the driver maintains responsibility for this part of the dynamic driving task (DDT)"

Note the bold parts. Tesla did not say "FSD Beta is L2 because we don't trust it to handle some stuff". They said "FSD Beta is L2 because it is not capable of doing some OEDR".

There are at least two ways we could interpret this:

1) We literally do not have the code to handle these situations at all.

2) The system is not safe and reliable enough to handle these situations autonomously without a human in the loop, hence it is necessary that the system is made to be Level 2 (and has safeguards to assure human attentiveness).

Since we’ve seen video proof that Teslas in L2 autonomy mode CAN successfully navigate construction zones, road debris, and adverse weather, I think (2) is the much, much more likely interpretation.
 
I’ve watched the original video where he said it.

It would be worth citing the original video and time stamp if anyone can be bothered to dig it up. Amnon says that object detection is solved but that other computer vision tasks like semantic segmentation (e.g. identifying free space) pose an ongoing challenge. The video was from a few years ago, but in the subsequent technical presentations Amnon has given (and I try to keep up on them all, although I’ve probably missed some), I’ve never heard him say, okay, we’ve now solved all those other computer vision problems too.

Moreover, even when he says object detection is “solved”, he is not saying we already have a system that is up to par with a human benchmark. He essentially just says (in my interpretation): the system is doing very well currently, the domain of object detection is very well understood and mature, and I’m confident we’re going to keep making progress until we surpass the human benchmark.

Several AV CEOs have seemingly fallen into this pattern: express confidence that AV will be solved in X years by extrapolating improvement in their performance metrics, then hit a plateau and have to do more R&D on their approach to break out of stagnating performance.

Basically, naïve linear (or exponential) extrapolation leads to overconfident predictions that don’t see various invisible performance ceilings on the way to human-level performance that have to be broken through with new approaches.
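The extrapolation trap above can be made concrete with a toy calculation. Assuming made-up numbers (a miles-per-intervention figure that doubled each year, and an arbitrary "human-level" target of one million miles), a naive forecast simply compounds last year's growth forever:

```python
# Hypothetical illustration of naive exponential extrapolation.
# All numbers are invented; the point is the reasoning pattern,
# not any real company's metrics.

years = [2017, 2018, 2019, 2020]
mpi = [1_000, 2_000, 4_000, 8_000]  # made-up miles-per-intervention, doubling yearly

growth = mpi[-1] / mpi[-2]   # naive assumption: last year's growth continues forever
human_level = 1_000_000      # illustrative reliability target, not a real benchmark

projection = mpi[-1]
year = years[-1]
while projection < human_level:
    projection *= growth
    year += 1

print(f"Naive forecast: human-level MPI reached by {year}")  # → 2027
```

The forecast says "solved in seven years," but it is only as good as the assumption that growth never plateaus, which is exactly the invisible performance ceiling the paragraph above describes.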
 
There are at least two ways we could interpret this:

1) We literally do not have the code to handle these situations at all.

2) The system is not safe and reliable enough to handle these situations autonomously without a human in the loop, hence it is necessary that the system is made to be Level 2 (and has safeguards to assure human attentiveness).

Since we’ve seen video proof that Teslas in L2 autonomy mode CAN successfully navigate construction zones, road debris, and adverse weather, I think (2) is the much, much more likely interpretation.

3. We're obviously testing a system designed to provide L4 autonomy. But Elon ordered us not to submit disengagement data so our lawyers instead prepared this platter of steaming BS.
 
FYI, if you happen to be inclined to read anything I’ve written about Tesla, Waymo, or autonomy, I would recommend you ignore anything written before 2019. Not all of it is wrong, but some of it definitely is. I can’t even remember most of what I wrote 3 or 4 years ago.

Some of my old stuff I'm still proud of, but nobody's 1st article or 10th article is going to be as good as their 100th. The stuff I've written in 2021 is better than what I wrote in 2020, and what I wrote in 2020 is better than what I wrote in 2019.

Also, autonomous driving is a fast-changing field, as are its constituent fields of deep learning, reinforcement learning, imitation learning, computer vision, etc. Articles I wrote 3 or 4 years ago would be hopelessly out of date even if they were otherwise perfect at the time — which they were not. Not only have I learned a lot about autonomous driving in the last 4 years and corrected a lot of my mistakes, the subject itself has changed substantially.

If you’ve followed the research in these fields or watched technical presentations from the major companies like Waymo, Cruise, Mobileye, Tesla, or the now defunct Uber ATG (acquired by Aurora), you can notice significant changes in what approaches they’re using or exploring from one year to the next. Anything anyone's said or written about autonomy that’s even a year old risks being out of date.

These days, I’m not focused on defending my old work or having a track record of always being right. That’s ego bullshit. I just like robots, I like AI, I like learning, and I like making money. As an investor, I'd rather be wrong and rich than right and broke. Which would you rather have: the feeling of being right or a wheelbarrow full of cash?

So, I’m focused on trying to have a better understanding of what’s true as I learn, discuss, debate, research, talk to experts, commission papers, talk things out with friends, watch FSD videos, etc. If in that process, I find reasons to believe (for instance) that Waymo is close to full autonomy and Tesla isn't, I'll move all the money I have in Tesla into Alphabet. I can do it in one minute on my brokerage app. The less emotionally invested I am in feeling like I'm right, the more flexible I'll be about changing my mind, and my financial investments will be better served.

I'm not an unattached Buddha, but I've pretty much stopped writing publicly, I deleted Twitter a while ago, and on places like this I try to only engage with people who I feel like are gonna have productive conversations that actually go somewhere and uncover new truths.

Some people are like scary obsessed with me, to the point that I had a cyberstalker who hounded me for about a year. The only reason I was able to get him to stop was that he carelessly revealed his identity. He was super creepy and did stuff that genuinely terrified me. I'm so relieved he didn't cover his tracks or I might still be dealing with that today.

There are some deeply unwell people you'll encounter on the Internet who seem to pour a disturbing amount of their self-worth into being right about a specific topic and beating down people who they perceive as being wrong. Seeing how unhappy those people are and how much hate, anger, and bitterness they're filled with and how f***ed up their mental health is was one reason I decided I had to let go of the desire to be vindicated or get public acclaim.

It's like meeting Gollum and seeing where the path you're on ends up if you keep putting on the Ring. Engaging too much in the dark aspects of the Internet can really warp your mind and your heart and f*** with you in a lasting way.
 
Let's get back to talking about Waymo.

Brand new video from JJ Ricks.


00:00 Boarding
00:30 Note the seatbelt
00:41 Ride start/neighborhood driving
01:06 Stops to let bird pass
01:43 Rider support call
02:30 Four way stop, hesitating in case pedestrian crosses the road
04:28 Hard brake for unexpected behavior from other car
05:12 Four way stop
05:28 Traffic cones flying about on the rider screen
06:12 Unprotected right, really nicely done
09:13 Teensy bit of braking for the cyclist
10:03 Beginning of wacky detour
10:40 Unprotected left at stop sign
11:10 Unprotected left (technically, not really exciting though)
11:40 Actual unprotected left
12:30 Unprotected right
12:50 Driving with no road lines
13:20 Hesitating at residential unprotected left
13:55 Wacky pedestrian detection on the rider screen
14:46 Go go go go go!
15:11 Bonus stuck spot clip!
15:32 Right turn into stuck spot
15:45 Shifts into park but ride doesn't "end"
16:28 Ride finally "ends"
16:40 Starting ride again, planned route goes through the cones
16:55 Rider support call
17:12 Rider support exasperated sigh (sick of dealing with me at this point I bet)
17:28 Should not have said that
17:50 Car backs out

A couple cases of hard braking and a couple moments of hesitation but overall really good IMO. Waymo handles turns really smoothly. It avoided a bird. It was safe around pedestrians. It handled a road with no lane lines really well.
 
> I'd rather be wrong and rich than right and broke.
I used to trade stock options. Made a lot of money but didn't feel right about it. Just felt like professional gambling. I stopped doing it.
I'd rather be broke and helpful than wrong and rich, but of course rich and helpful is even better.
 