Waymo is now the exclusive L4 partner for Volvo!

This is big news!

"Waymo is now the exclusive global L4 partner for Volvo Car Group, a global leader in automotive safety, including its strategic affiliates Polestar and Lynk & Co. International. Through our strategic partnership, we will first work together to integrate the Waymo Driver into an all-new mobility-focused electric vehicle platform for ride-hailing services. "

Full blog:
Waypoint - The official Waymo blog: Partnering with Volvo Car Group to scale the Waymo Driver

The blog mentions ride-hailing, but I think this deal also opens the door for Volvo and Polestar to deliver L4 cars to consumers that use Waymo's software. This deal could allow Waymo to really scale up and deliver FSD to more people.

Not a big surprise, considering that Volvo is going to be using Android Automotive (Polestar 2)
 
  • Like
Reactions: diplomat33
Except, they haven't finished those.

If they had, they wouldn't need to restrict it to a tiny suburb in Arizona with ideal weather, and they wouldn't still need backup remote operators even then.

You keep ascribing far greater "finished" and "solved" claims to Waymo than there's any evidence to support.

If that were true (which it obviously isn't), they wouldn't need to be geofenced.

If we're gonna just quote CEO claims directly contradicted by facts on the ground, we'll be here for weeks citing stuff Elon said. Don't think you wanna go down that rabbit hole :)

And Tesla is still asking for confirmation on stoplights and signs to make sure it works right in ALL areas before removing confirmations.

You don't get to make excuses for why they don't really have anything "solved" unless everyone gets to.

All levels are stepping stones to L5 depending on your approach.

Waymo is testing in 25+ cities across the US with over 10M autonomous miles. They are not restricted to just the "tiny suburb in AZ in ideal weather". That's just where they decided to launch a ride-hailing service because in that one area, their FSD is good enough to start removing the safety driver for some drives. Obviously, Waymo has not solved FSD but I do consider a feature like traffic light detection to be solved by Waymo.

I admit it might be difficult to determine the exact state of how good someone's FSD is with just public information. But if I am overestimating how good Waymo is, some of you are underestimating how good Waymo's FSD is.
 
Last edited:
  • Disagree
Reactions: mikes_fsd
But geofencing and mapping are part of their solution; that is what made it easier to solve for the small area.
That is why I specifically used L5 in my comments, so that your bull$#!t fairy tales can fall by the wayside.

Waymo's approach to L4 is not a stepping stone to L5. Geofencing will never work as an L5 solution, and HD maps are going to be a nightmare to maintain for an L5 FSD solution -- pretty sure they are going to find that out the hard way.
 
Waymo is testing in 25+ cities across the US with over 10M autonomous miles.

But you said they had this "solved" and "finished"

Why would they need to still test?

Is it... because they do NOT have this solved and finished as you claimed?



Obviously, Waymo has not solved FSD.


Obviously?

Waymo has solved all the fundamentals of FSD.


they have solved the fundamentals of autonomous driving
 
  • Love
Reactions: mikes_fsd
But you said they had this "solved" and "finished"

Why would they need to still test?

Is it... because they do NOT have this solved and finished as you claimed?

Obviously?

They are still testing in order to improve reliability of said features.

You understand that there are basic perception tasks. For example, basic features like detecting lane lines, detecting a stop sign or reading a speed limit sign. That is what I call the fundamentals of FSD. But I hope you realize that perception is not the end of FSD. After perception, you still need to get the car to handle driving in a safe and reliable way, which involves a lot more than just perception. You need planning and driving policy. For example, your car needs to figure out when to change lanes safely, when to stop, when to go, when to wait until the path is clear before completing an unprotected left turn, and how to predict the path of a cyclist so you don't make a turn that risks a collision, etc.

So when I say that the fundamentals are solved but FSD is not solved, I mean that the basic perception tasks are done, but the "total package" of the car safely driving on its own in all cases is not, because the car may still not handle every complex situation perfectly. Get it?
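To make the perception-vs-planning split concrete, here is a minimal, purely hypothetical Python sketch (none of these names come from Waymo's or Tesla's actual software). Perception turns sensor data into detections; the driving policy still has to turn those detections into safe decisions, and that second layer is where "fundamentals solved" stops short of "FSD solved":

```python
# Purely illustrative sketch -- not Waymo's or Tesla's real stack.
# Point: solving perception (the "fundamentals") is not the same as
# solving FSD; you still need prediction and a driving policy.
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    kind: str            # e.g. "lane_line", "stop_sign", "traffic_light", "cyclist"
    distance_m: float    # distance ahead of the car

def perceive() -> List[Detection]:
    """Fundamentals: detect lane lines, signs, lights, other road users.
    Hardcoded here; in reality this is neural nets over the sensor feeds."""
    return [Detection("stop_sign", 40.0), Detection("cyclist", 12.0)]

def plan(detections: List[Detection]) -> str:
    """Driving policy: turn detections into a safe decision.
    This layer (plus prediction) is what stays hard after perception works."""
    if any(d.kind == "cyclist" and d.distance_m < 15 for d in detections):
        return "yield"   # e.g. wait before committing to an unprotected turn
    if any(d.kind == "stop_sign" and d.distance_m < 50 for d in detections):
        return "stop"
    return "proceed"

print(plan(perceive()))  # -> "yield"
```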
 
  • Disagree
Reactions: mikes_fsd
It almost feels like @diplomat33 wishes there was a special FSD level just for Waymo that allows them to claim "FSD is solved with geo-fencing".

HA HA. Yes, it's called L4. I did not make it up. The Society of Automotive Engineers (SAE) says that a car that can fully self-drive, but only in a specific geographic area, is a type of FSD called L4. It's different from L5 but it is still FSD because the car can fully self-drive.

A car that can fully self-drive in the city of LA = L4

A car that can fully self-drive anywhere in the US = L5

Both are FSD because in both cases the car is fully self-driving with no human input; it's just that one works in a limited area while the other works in a much broader area.

Get it?
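If it helps, the L4-vs-L5 distinction can be pictured as nothing more than an operational design domain check. This is a hypothetical sketch with made-up coordinates, not anything from Waymo's software:

```python
# Hypothetical illustration of SAE L4 vs L5 -- not real Waymo or Tesla code.
from dataclasses import dataclass

@dataclass
class GeoFence:
    """A crude rectangular operational design domain (ODD)."""
    lat_min: float
    lat_max: float
    lon_min: float
    lon_max: float

    def contains(self, lat: float, lon: float) -> bool:
        return (self.lat_min <= lat <= self.lat_max
                and self.lon_min <= lon <= self.lon_max)

# L4: the car drives itself, but only inside its ODD (rough, made-up bounds
# loosely around the Chandler, AZ area).
CHANDLER_ODD = GeoFence(33.20, 33.35, -111.95, -111.75)

def l4_can_engage(lat: float, lon: float) -> bool:
    return CHANDLER_ODD.contains(lat, lon)

def l5_can_engage(lat: float, lon: float) -> bool:
    return True   # L5: by definition, no geographic restriction

print(l4_can_engage(33.30, -111.84))   # True  -> inside the geofence
print(l4_can_engage(34.05, -118.24))   # False -> downtown LA, outside the ODD
```

Both functions describe a car that drives itself; only the size of the area where it is allowed to do so differs.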
 
Yes, it's called L4. I did not make it up.
L4 is not a solved FSD solution. It "could be" if a real L5 car is handicapped/limited in software to a geographical area, but that is NOT where Waymo is at.

My point is that you want to claim FSD is solved for Waymo by hiding behind the labels of L4 and L5.
Waymo does not have FSD solved; they fake L4 by mapping the $#!t out of the small area where they allow it to drive (like a train on rails with some bells and whistles).
And THAT solution cannot become L5.

We've already agreed in another post (it was specifically pointed out) that we will grade everyone who claims to have "solved FSD" the same...
FSD has to be available:
  • Anywhere
  • Anytime
  • To anyone (i.e. no pre-approval or other bureaucracy)
 
L4 is not a solved FSD solution. It "could be" if a real L5 car is handicapped/limited in software to a geographical area, but that is NOT where Waymo is at.

My point is that you want to claim FSD is solved for Waymo by hiding behind the labels of L4 and L5.

I never said that Waymo solved FSD. I said they solved the fundamentals of FSD and have L4 FSD.

Waymo does not have FSD solved; they fake L4 by mapping the $#!t out of the small area where they allow it to drive (like a train on rails with some bells and whistles).
And THAT solution cannot become L5.

No, Waymo has not solved all of FSD. But they are not faking L4. They have real autonomous driving. Waymo has camera, lidar, and radar perception, and the car drives completely on its own, not locked to a pre-determined path: it makes its own decisions about what route to take, when to go, when to stop, how to go around an obstacle, etc. It is real autonomous driving and yes, it is a path to L5.
 
  • Disagree
Reactions: mikes_fsd
They are still testing in order to improve reliability of said features.

If their solution isn't reliable then they haven't really "solved" the problem, right?

I don't think you can go "Well, this sometimes works!" and call it a solution to a problem.

NoA works the VAST majority of the time with no user intervention- but I don't claim Tesla has "solved" highway driving.... (Elon does- but that's another topic)


You understand that there are basic perception tasks. For example, basic features like detecting lane lines, detecting a stop sign or reading a speed limit sign. That is what I call the fundamentals of FSD.

Ok. Tesla does all that too (Well, not reading signs in released code, but we know it "can" do it and is coming soon)

Again, Waymo can do this stuff under specific conditions in specific places. As the autonomy video showed- so can Tesla.

Waymo is aiming for a solution that also relies on HD maps and will only work in places with such maps. Tesla is working on one that works "anywhere" without the HD maps.

Neither has "solved" it, which is why neither has a fully deployed retail product in wide use.
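Roughly speaking (hypothetical names, not either company's actual code), that design difference looks like this: a map-reliant stack can read details like lane counts or traffic light positions from a prior built offline, but only where that prior exists, while a map-free stack has to estimate everything from live perception wherever it happens to be.

```python
# Hypothetical contrast between an HD-map-reliant stack and a map-free one.
# Not Waymo's or Tesla's real code; it only illustrates the dependency.
from typing import Optional

# Offline-built HD map prior: rich detail, but only for mapped intersections.
HD_MAP = {
    ("phoenix", "main_st_and_1st_ave"): {"traffic_light_height_m": 5.2, "lanes": 3},
}

def lookup_prior(city: str, intersection: str) -> Optional[dict]:
    """Map-reliant approach: precise where mapped, nothing elsewhere."""
    return HD_MAP.get((city, intersection))

def perceive_live(camera_frame) -> dict:
    """Map-free approach: must estimate everything from sensors, anywhere.
    Stubbed out here; in reality this is the hard computer-vision problem."""
    return {"traffic_light_height_m": 5.0, "lanes": 3}   # noisier, but works unmapped

prior = lookup_prior("austin", "congress_and_6th")        # unmapped city -> None
scene = prior if prior is not None else perceive_live(camera_frame=None)
print(scene)
```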
 
  • Like
Reactions: mikes_fsd
If their solution isn't reliable then they haven't really "solved" the problem, right?

I don't think you can go "Well, this sometimes works!" and call it a solution to a problem.

NoA works the VAST majority of the time with no user intervention- but I don't claim Tesla has "solved" highway driving.... (Elon does- but that's another topic)

Ok. Tesla does all that too (Well, not reading signs in released code, but we know it "can" do it and is coming soon)

Again, Waymo can do this stuff under specific conditions in specific places. As the autonomy video showed- so can Tesla.

Waymo is aiming for a solution that also relies on HD maps and will only work in places with such maps. Tesla is working on one that works "anywhere" without the HD maps.

Neither has "solved" it, which is why neither has a fully deployed retail product in wide use.

Again, I think we need to distinguish "FSD" from the "fundamentals of FSD".

Has Waymo solved city driving? No. And I agree that neither Tesla nor Waymo has solved FSD. But has Waymo solved a specific feature like "traffic light response", at least in the areas that they operate in? Yes, absolutely. Case in point: Waymo does not require steering wheel nags or stalk confirmations on any of their drives. That tells me that Waymo is confident in their features in the geofenced areas where they drive. But I grant you that it is easier to solve something for L4 than it is to solve it for L5.

Remember, it is possible to solve a specific subset, like "traffic light response", without having solved all of FSD.
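To illustrate what "solving a subset" means, here is a toy, hypothetical traffic-light response policy (not Waymo's implementation). A module this narrow can be specified and unit-tested pretty much exhaustively, which says nothing about whether the rest of FSD (unprotected turns, cut-ins, construction zones) is solved:

```python
# Toy traffic-light response policy -- hypothetical, not Waymo's implementation.
def traffic_light_response(light_state: str, distance_m: float, speed_mps: float) -> str:
    if light_state == "green":
        return "proceed"
    if light_state == "yellow":
        # Stop only if we can brake comfortably (~3 m/s^2), otherwise continue.
        stopping_distance_m = speed_mps ** 2 / (2 * 3.0)
        return "stop" if distance_m > stopping_distance_m else "proceed"
    return "stop"   # red or unknown: always stop

# A subset this small can be covered by exhaustive-style unit tests.
assert traffic_light_response("green", 30.0, 15.0) == "proceed"
assert traffic_light_response("red", 30.0, 15.0) == "stop"
assert traffic_light_response("yellow", 50.0, 15.0) == "stop"      # room to stop
assert traffic_light_response("yellow", 10.0, 15.0) == "proceed"   # too close to stop safely
```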
 
Unless you count the remote backup driver that still has to help sometimes.

But then the autonomy day demo showed Tesla can handle that too, as long as you narrow the operational domain enough, without even needing the remote driver.

Huh, you are comparing a demo that took dozens of attempts to get right, and they only used the take with no failure. A demo that had a driver behind the wheel ready to take over at any moment? A demo that involved practically no one on the road.

Versus the thousands of rides that Waymo has given in Phoenix with no one in the driver seat?

Seriously it's not illegal to think...
 
Huh, you are comparing a demo that took dozens of attempts to get right

No, it did not.

It took one.

You might be confusing the autonomy day video (done in 1 take) with the 2016 video (that took a TON of takes and about 500 miles of driving attempts)

We know this because Tesla had to report autonomous testing miles for both years.

Tesla's report to CA in 2019 said:
"In April, we operated one vehicle in autonomous mode to record one demo run on a 12.2-mile route around Tesla's Palo Alto headquarters. The route covered surface streets and highways. We did not experience any autonomous mode disengagements during this run and, as a result, do not have any disengagements to report for Reporting Year 2019."




Ignored the rest of the post since it's based on false premises.
 
  • Love
Reactions: mikes_fsd
No, it did not.

It took one.

You might be confusing the autonomy day video (done in 1 take) with the 2016 video (that took a TON of takes and about 500 miles of driving attempts)

We know this because Tesla had to report autonomous testing miles for both years.

Ignored the rest of the post since it's based on false premises.

Still, there is a big difference between doing one 12-mile drive on a pre-planned route and doing thousands of drives with no safety driver, all across town, on any route the passenger asks for. Waymo has consistently shown more impressive FSD than Tesla.
 
  • Disagree
Reactions: mikes_fsd
Things are starting to align. This was what was needed to fully realize this goal; doing it piecemeal, like a lot of car companies were trying to do, was never going to work, at least for the first couple of years. You either integrate from the ground up or you don't do it at all.

Volkswagen is going with Mobileye
Mercedes Group has decided to go with Nvidia hardware and software
GM is doing their own hardware and software
Nissan and Volvo are going with Waymo
Tesla is doing their own hardware and software

Hyundai has apparently gone with Aptiv to deploy robotaxis, according to this Electrek article.
Waymo and Volvo team up for all-electric robotaxis and Level-4 Polestar EVs - Electrek
 
  • Informative
Reactions: Bitdepth
No, it did not.

It took one.

You might be confusing the autonomy day video (done in 1 take) with the 2016 video (that took a TON of takes and about 500 miles of driving attempts)

We know this because Tesla had to report autonomous testing miles for both years.






Ignored the rest of the post since it's based on false premises.

Wrong. Even NHTSA knew the entire thing was BS. They sent a letter to Tesla asking what made the video autonomous but the drives that they gave to attendees not autonomous (and hence not reported), and they simply hand-waved it away.

It's bullshit; there's no difference between that video and any other Tesla video online.
You can simply record your success and claim that's all you did. NHTSA knew that and that's why they sent that letter.

This letter is now public. It's on Twitter but I can't find it at the moment.

This is proven by the fact that a lot of people had disengagements in their rides, which took a similar route.
If it was truly a one-shot take, then the overwhelming result should be that almost all attendee rides would be flawless, without disengagements.

Again, I will repeat.

You are comparing a demo video based on a single successful take on exactly one route (which failed in most of the drives later on that same single route), on mostly empty ghost streets, with a human driver ready to take over in an instant.

Versus thousands of rides across the city given to real people WITH NO ONE IN THE DRIVER SEAT and NO ABILITY TO TAKE OVER PERIOD.

You ever heard of logic?
 

I mean, it's not though.


YOU claimed, wrongly:

a demo that took dozens of attempts to get right, and they only used the take with no failure.

That's factually false- as evidenced by their report. It took ONE attempt to get right, no outtakes.

I get you're pouty about being wrong, but that's no reason to double down on dishonesty....which is what you did further on-



Even NHTSA knew the entire thing was BS. They sent a letter to Tesla asking what made the video autonomous but the drives that they gave to attendees not autonomous (and hence not reported), and they simply hand-waved it away.

What letter?

I ask because the NHTSA doesn't regulate this stuff. At all.

The reporting was done to the state of CA, not a national agency.

Hell I even pointed that out to you in my last post but you STILL screwed it up.


It's bullshit; there's no difference between that video and any other Tesla video online.

Really?

Can you show me all the other videos where the car drives 12 miles with no interventions, including handling turns at intersections?

Go ahead... we'll wait....

Well, ok, we won't, since that doesn't exist.



You can simply record your success and claim that's all you did. NHTSA knew that and that's why they sent that letter.

Again- what letter?

Again I ask because this isn't regulated by the NHTSA in the first place.


It's like you have literally no idea WTF you're talking about.


This letter is now public. It's on Twitter but I can't find it at the moment.

Of course you can't.

Because you imagined it.

Because that agency doesn't even have authority over the situation.

Because you're making up nonsense.


NO ABILITY TO TAKE OVER PERIOD.


Waymo's cars all have remote folks who can send commands to the car when it gets into situations it can't understand (and that ABSOLUTELY happens any number of times).


You ever heard of logic?


You ever heard of telling the truth?

Give it a shot sometime!
 
Last edited:
  • Like
Reactions: mikes_fsd
YOU claimed, wrongly:
That's factually false- as evidenced by their report. It took ONE attempt to get right, no outtakes.
I get you're pouty about being wrong, but that's no reason to double down on dishonesty....which is what you did further on-

Which is why it failed on almost every ride later on, on the same exact route? Tesla fanboy logic at its finest.

Again- what letter?
Again I ask because this isn't regulated by the NHTSA in the first place.
It's like you have literally no idea WTF you're talking about.

Of course you can't.
Because you imagined it.
Because that agency doesn't even have authority over the situation.
Because you're making up nonsense.

I meant CA DMV... but you already knew that.

https://twitter.com/Tweetermeyer/status/1267588148202057728



Waymo's cars all have remote folks who can send commands to the car when it gets into situations it can't understand (and that ABSOLUTELY happens any number of times).

Waymo's remote operators have no ability to take over the car at all, let alone in an instant; there is no emergency stop button, and they don't joystick or pedal the car. If a failure happens, the car simply crashes.
The Tesla driver, on the other hand, is clearly watching the road and simply grabs the wheel and/or applies the gas/brake pedal, and disaster is avoided.

One is a driverless L4 car; the other is an L2 driver assistance system.
 
Last edited:
Which is why it failed on almost every ride later on, on the same exact route?

<citation required>

I meant CA DMV... but you already knew that.

I mean, at least ONE of us did :)




Uh... not sure what you THOUGHT this said, but this doesn't support your claim at all.

It makes clear that the filmed drive was different from the ones the customers went on: those were L2 drives, not L3, used only a "similar" but not identical route, and were in different vehicles than the one in the demo film.

So, in short, it states the opposite of your claim on multiple fronts.


Thanks for debunking your own nonsense though! Well done!


One is a driverless L4 car; the other is an L2 driver assistance system.


Once again we know you're wrong, because you proved it to us.

The letter you cite states that the Tesla demo drive that was filmed was L3, not L2... and mentions their plans (over a year ago now) to do additional L3/L4 testing (though likely outside California).



I'd say quit while you're behind, but this is hilarious... keep digging, man! You'll reach Giga Shanghai in no time!
 
Last edited: