Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Autonomous Car Progress

That's why I'm not forgetting it- same thing applies to both.

Vision can do the task.

Humans do that task with vision right now.

So does Tesla's vision-only system.

Of course vision can. That's not the issue. Capability is not enough. What is the accuracy? You need accuracy for all tasks and in all conditions.

Currently, vision is not capable of that level of accuracy in all perception tasks and in all conditions simultaneously:

As of today, a single sensor is not capable of simultaneously providing reliable and precise detection, classifications, measurements, and robustness to adverse conditions. "Safety First for Automated Driving" (p 47)

And it would seem that FSD Beta is not calculating distances and velocities accurately enough since it did not avoid the car running a stop sign in this video:


This is only relevant if you rely on mm-accurate HD maps. Which of course Tesla does not-- and doing so makes scaling your system incredibly expensive and difficult (see again the failure of Waymo to make it out of one tiny suburb with this issue).

No. HD maps do not make scaling expensive and difficult.

Here is HD maps being created automatically:


And, Waymo does not have a problem driving autonomously outside of Chandler. They drive millions of miles outside of Chandler. HD maps are not the reason why Waymo has not deployed a public service outside of Chandler yet. They have not deployed public ride-hailing in other areas yet because they are working to make the planning and driving policy safer in edge cases. And Waymo is finishing validation for the new 5th Gen hardware.

We've been over this and you still refuse to accept facts.
 
Without ML you don't know it's a car.
Well technically there was computer vision before ML. Of course it didn't work as well as ML.
A large enough bag or tarp would be an example. In fact, most arguments for sensor fusion cite cameras as being quite important precisely for figuring out which of the large things LIDAR flags are NOT things you necessarily want to slam on the brakes for.
Yeah but it might be safer to just slam on the brakes. Keep in mind that self-driving cars might be viable even with erroneous braking (CA data shows that they get rear ended a lot!). The new Tesla vision system still has phantom braking.
What's funny is that VIDAR has the same sensor fusion issues that everyone here worries about so much.
This is only relevant if you rely on mm-accurate HD maps. Which of course Tesla does not-- and doing so makes scaling your system incredibly expensive and difficult (see again the failure of Waymo to make it out of one tiny suburb with this issue).
LIDAR can detect curbs without HD maps... I've seen a few FSD beta users hit curbs (and many more disengage to avoid them).
 
Of course vision can. That's not the issue. Capability is not enough. What is the accuracy? You need accuracy for all tasks and in all conditions.

You need some level of accuracy.

But as pointed out- you don't need mm accuracy for, say, "stopping distance"

because you're always going to want to stop considerably more than mms away from an object you don't want to hit.
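To put rough numbers on that point (my own back-of-envelope figures, not anything from Tesla): at highway speed the distance needed to stop is tens of metres, so a few centimetres of range error is noise next to the standoff buffer you'd plan anyway.

```python
# Sketch with assumed values: how much does a small range error
# matter relative to total stopping distance and safety buffer?

def stopping_distance_m(speed_mps, decel_mps2=6.0, reaction_s=0.5):
    """Reaction distance plus braking distance at constant deceleration."""
    return speed_mps * reaction_s + speed_mps**2 / (2 * decel_mps2)

speed = 25.0                 # ~90 km/h
needed = stopping_distance_m(speed)   # roughly 64-65 m of road to stop
buffer = 2.0                 # standoff you'd leave from the object (m)
sensor_error = 0.05          # assume a 5 cm range measurement error

print(needed > 60)                 # True: tens of metres, not millimetres
print(sensor_error / buffer)       # 0.025: the error is 2.5% of the buffer
```

Even doubling the assumed sensor error leaves it a rounding error against the buffer, which is the poster's point about mm accuracy being unnecessary for stopping.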

From what I've seen from Green so far, vision accuracy seems quite good, at least when directly compared to the distance and speed data radar is providing.

We obviously don't have the data to know how it compared to LIDAR.

Tesla does though and they seem to be moving ahead with vision only anyway.


Currently, vision is not capable of that level of accuracy in all perception tasks and in all conditions simultaneously:



And it would seem that FSD Beta is not calculating distances and velocities accurately enough since it did not avoid the car running a stop sign in this video:

27+ minute video, hitting play starts at the intro- you're gonna need to give a timestamp or something


No. HD maps do not make scaling expensive and difficult.


It suggests spending on HD maps in 2020 is already north of $1.5 billion a year, and will be more than 10x higher by 2028.

Sounds expensive to me.

Also mentions:
"The cost incurred in creating HD maps is expected to hamper deployment in autonomous vehicles in the initial phase of the forecast period"


Showing me demos of "self mapping" that "will fix this some day in the future" isn't any more compelling than showing me demos of "self driving" with such claims.


And, Waymo does not have a problem driving autonomously outside of Chandler. They drive millions of miles outside of Chandler. HD maps are not the reason why Waymo has not deployed a public service outside of Chandler yet. They have not deployed public ride-hailing in other areas yet because they are working to make the planning and driving policy safer in edge cases. And Waymo is finishing validation for the new 5th Gen hardware.

We've been over this and you still refuse to accept facts.

Unclear on these facts.

Which facts, specifically, make their planning and driving policy "safe enough" in Chandler AZ, but nowhere else in the world?





LIDAR can detect curbs without HD maps... I've seen a few FSD beta users hit curbs (and many more disengage to avoid them).

I'd bet that's more an issue of the cameras not being able to see low enough around the car than anything else (and I've said for years that adding low/side-viewing cameras on the front/rear fenders would be a REALLY good idea for Tesla, FWIW- it would also enable a real 360 parking view)
 
You need some level of accuracy.

But as pointed out- you don't need mm accuracy for, say, "stopping distance"

because you're always going to want to stop considerably more than mms away from an object you don't want to hit.

From what I've seen from Green so far, vision accuracy seems quite good, at least when directly compared to the distance and speed data radar is providing.

No, you don't need mm accuracy for stopping, but you do need a high level of accuracy for a lot of tasks. There are cases like curbs where if you are just a few inches off, you will hit the curb. There are also cases like Chuck's unprotected left turns with fast-moving cross traffic where you had better be very accurate with both position and velocity. The bottom line is that more accuracy is always a good thing. Better to be more accurate than you need than not accurate enough.
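To illustrate why velocity accuracy matters so much in the unprotected-left case (toy numbers of my own, not from any real stack): a modest underestimate of cross-traffic speed can make a too-tight gap look safe.

```python
# Time-to-arrival gap check with an assumed perception error.

def tta(distance_m, speed_mps):
    """Seconds until the cross-traffic car reaches the intersection."""
    return distance_m / speed_mps

dist = 160.0        # cross traffic 160 m away
true_speed = 27.5   # actually doing ~99 km/h
seen_speed = 25.0   # perception underestimates speed by ~9%
maneuver = 6.0      # seconds the turning car needs to clear the lane

print(tta(dist, seen_speed) > maneuver)   # True  -> gap looks safe
print(tta(dist, true_speed) > maneuver)   # False -> it is not
```

A ~9% velocity error flips the decision here, whereas a few millimetres of position error would not, which is why the two accuracy requirements shouldn't be lumped together.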

27+ minute video, hitting play starts at the intro- you're gonna need to give a timestamp or something

Sorry. Start at 5:30.

Unclear on these facts.

Which facts, specifically, make their planning and driving policy "safe enough" in Chandler AZ, but nowhere else in the world?

Because Waymo has the data in Chandler to trust that it is safe enough in Chandler. Other cities will have new edge cases, not encountered in Chandler. That is why Waymo is testing their autonomous driving in other cities, and collecting more data, to make sure their autonomous driving is safe enough in those other cities. The Waymo Driver might be safe enough in other cities, but they need the data to prove it before public deployment.
 
No, you don't need mm accuracy for stopping, but you do need a high level of accuracy for a lot of tasks. There are cases like curbs where if you are just a few inches off, you will hit the curb.

Yes, and I already addressed this is a specific area I think Tesla would be very very well served by adding a couple of side/low cameras that don't exist today.

THAT said- most places legally you need to park within 12 inches of a curb. I don't think we know how accurate, exactly, Tesla's vision distance measurements are- but I expect they'll be plenty good enough to manage better than a full foot off in production.



There are also cases like Chuck's unprotected left turns with fast-moving cross traffic where you had better be very accurate with both position and velocity.

I wouldn't expect differences of a few mm to matter here either-- if an unprotected left is that close you shouldn't be trying to make that tiny of a gap.

Velocity measurement is important- but again, from what we've seen from Green so far, the velocity estimates between vision and radar have been VERY close already- and this is a system still in a pre-release state, so they will likely keep getting closer.


The bottom line is that more accuracy is always a good thing. Better to be more accurate than you need than not accurate enough.

But since cost and complexity are real things, being more accurate than you need to be isn't necessarily better than being accurate enough to do the task.




To be clear, I'm not even saying Tesla's current HW and approach WILL be enough.

I'm saying that if their thinking is correct- that they can "solve" vision to provide accurate enough speed, distance, and recognition with just vision- then their solution will be much simpler, much cheaper, and far more scalable, with no need whatever for LIDAR, since they'll already have the data they need without it.




Sorry. Start at 5:30.


I did... I see a non-Tesla fail to stop and nearly hit the Tesla... I didn't see the Tesla miss a stop sign at all. I watched several more minutes where it correctly stopped at multiple stop signs... so I'm not sure what you were trying to show here.



Because Waymo has the data in Chandler to trust that it is safe enough in Chandler. Other cities will have new edge cases, not encountered in Chandler. That is why Waymo is testing their autonomous driving in other cities, and collecting more data



Waymo began testing without a driver in Chandler in Oct 2017. They launched the consumer-facing service barely a year later- Dec 2018- initially with safety drivers, but by Nov 2019 (~2 years after they began testing it) they were doing driverless in Chandler.

Waymo has been testing in the SF Bay area since 2009. They'd done significant testing in a couple DOZEN other cities by the time they launched driverless in Chandler. 0 of them have driverless today.

The things Chandler has those other places don't-

VERY simple roads that don't change often (so the need for detailed, frequently updated HD maps goes away)

VERY simple weather (generally dry and sunny more than 90% of days)

Seems like they're stuck in one of those local maximums Elon talked about a while back.
 
I'm saying if their thinking that they can "solve" vision to provide accurate enough speed, distance, and recognition with just vision is correct- then their solution will be much simpler, much cheaper, and far more scalable, and would have no need whatever for LIDAR since they've already gotten the data they need without it.

Yes, that is Tesla's "gamble". If Tesla is right, then they will have a cheap, scalable solution. But if they are wrong then they will have to go back and add the sensors anyway and will have lost a lot of time when they could have added the sensors from the start. We shall see. So far, Tesla's FSD Beta is improving and is doing self-driving but it is also encountering issues and has not achieved the reliability to remove driver supervision yet. Tesla could hit a "local max" where vision-only can do good self-driving but can't do some situations reliably and still requires driver supervision and therefore falls short of true FSD.

I did... I see a non-Tesla fail to stop and nearly hit the Tesla... I didn't see the Tesla miss a stop sign at all. I watched several more minutes where it correctly stopped at multiple stop signs... so I'm not sure what you were trying to show here.

No. The other car failed to stop at a stop sign and was going to hit the Tesla. The other car was at fault. But good FSD should avoid accidents where the other car is at fault as much as possible. I am saying that FSD Beta should have done a better job of detecting that the other car was going to hit it, and taken evasive action.

Waymo began testing without a driver in Chandler in Oct 2017. They launched the consumer-facing service barely a year later- Dec 2018- initially with safety drivers, but by Nov 2019 (~2 years after they began testing it) they were doing driverless in Chandler.


Waymo has been testing in the SF Bay area since 2009. They'd done significant testing in a couple DOZEN other cities by the time they launched driverless in Chandler. 0 of them have driverless today.

The things Chandler has those other places don't-

VERY simple roads that don't change often (so the need for detailed, frequently updated HD maps goes away)

VERY simple weather (generally dry and sunny more than 90% of days)

Seems like they're stuck in one of those local maximums Elon talked about a while back.

I don't see Waymo necessarily stuck at a local max at all. I see Waymo doing a lot of testing like everybody is doing and making sure FSD is safe before deployment. It takes a lot of time and data to actually do real FSD the right way. Waymo is also transitioning from the 4th Gen to the 5th Gen which takes time to properly validate. When Waymo is ready to deploy in more areas, they will. But we shall see. If Waymo fails to deploy the 5th Gen in the next 2-3 years, then I might agree with you.
 
That wasn't the point though- you were touting the ability to see in complete darkness... which is a condition you ought never drive in to begin with.
That is the EXACT point. The ability to see in complete darkness helps with areas your headlights can't reach that you still need to perform the entire DDT on. I'm not talking about some glorified lane-keeping system. It matters when making turns, backing up, handling u-turns and cross traffic, driving in neighborhoods with no street lamps, in emergencies, and in inclement weather where visibility is even lower, even with headlights.
You seem unclear on what the word blind means.

"I see something but do not know what it is" is not blind.
It doesn't see at all. You are trying to argue with something you have no clue about.
So it gives you.... a bunch of numbers.

Same as you just admitted a camera does.
Your ignorance is showing here. It's not even close to what cameras give you.
Lidar gives you 3D points. That's a huge difference from cameras. It's the entire point.
With 3D points you can run control theory and drive a car, robot, drone, etc.
This is basic Robotics 101. You have no clue what you are talking about.

Sure it does.

With ML.

Just like vision.

In both cases you get a bunch of data from a sensor, but need ML to know what you're "looking" at.


There's Waymo discussing some of the ML tasks they're using on LIDAR data for example.
You are clueless. Why don't you try educating yourself first on Robotics 101 before trying to talk about things you have no idea about.
 
The point is that with LIDAR you don't need to use ML to detect objects. You can write regular old procedural code to prevent you from running into another car. Name a car-sized object that you might want to run into- you don't always need to know what you're looking at.
Obviously you can feed LIDAR data or any other data into NNs too.
Just think of LIDAR as a superior more expensive form of the VIDAR that Tesla is working on.
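A minimal sketch of that "no ML needed" claim (my own toy geometry and thresholds, not any real AV stack): procedural code over raw 3D points can flag "something big in my lane" without ever classifying what it is.

```python
# Points are (x, y, z) in metres: x forward, y left, z up, car at origin.

def obstacle_ahead(points, lane_half_width=1.2, max_range=30.0,
                   min_height=0.3, min_hits=5):
    """Brake-worthy if enough returns sit in the corridor ahead and
    above road level -- no classification of WHAT the object is."""
    hits = [p for p in points
            if 0.0 < p[0] < max_range          # in front of us
            and abs(p[1]) < lane_half_width    # inside our lane corridor
            and p[2] > min_height]             # above the road surface
    return len(hits) >= min_hits

# A wall of returns 12 m ahead, ~0.8 m off the ground, triggers it:
car_shape = [(12.0, 0.1 * i, 0.8) for i in range(-5, 6)]
print(obstacle_ahead(car_shape))   # True
print(obstacle_ahead([]))          # False (no returns, nothing ahead)
```

Of course a real stack does far more (ground-plane fitting, clustering, tracking), but the core "don't hit the big thing" check really is just geometry over 3D points.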

These people are so clueless it's hard to have a basic 1+1 discussion with them.
And they refuse to be educated, which is the worst part. It's so sad.
 
What's funny about all of this is that we're taking pictures of processed LIDAR outputs, running them through our own human brains, and concluding, "hey! LIDAR can obviously see that vantablack car!"


My goodness, you are so uninformed it's not even funny.
How about you go take a basic Robotics 101 course online? There are so many free ones at Coursera, Udacity, etc.
Educate yourself.
 
Definitely. The narratives have changed, but there's always been a broad effort to hurt Tesla by spreading FUD online.

Shorts have taken huge losses over the years.
The auto and ADAS OEMs are still having a hard time competing with Tesla.
Posting garbage online to TMC (twitter and other forums) to get the media attention as "news" is cheap and easy.
Building EVs, battery plants, ADAS HW & SW, ...etc... is hard and expensive.

Do people get paid to spew misinformation? Sometimes I wonder.
 
Definitely. The narratives have changed, but there's always been a broad effort to hurt Tesla by spreading FUD online.

Shorts have taken huge losses over the years.
The auto and ADAS OEMs are still having a hard time competing with Tesla.
Posting garbage online to TMC (twitter and other forums) to get the media attention as "news" is cheap and easy.
Building EVs, battery plants, ADAS HW & SW, ...etc... is hard and expensive.
 
So far, Tesla's FSD Beta is improving and is doing self-driving but it is also encountering issues and has not achieved the reliability to remove driver supervision yet.
Funny you don't hold your favorite company to that same exact standard. Except in Chandler, AZ, which is hardly NYC or SF (the latter apparently now being driven by your favored company with safety drivers).
 
Because Waymo has the data in Chandler to trust that it is safe enough in Chandler. Other cities will have new edge cases, not encountered in Chandler. That is why Waymo is testing their autonomous driving in other cities, and collecting more data, to make sure their autonomous driving is safe enough in those other cities. The Waymo Driver might be safe enough in other cities, but they need the data to prove it before public deployment.
If it takes years of test driving and running a tiny service to gather data in each new city then Waymo might as well give up now.

The things Chandler has those other places don't-

VERY simple roads that don't change often (need to have detailed HD maps that are updated a lot is gone)

VERY simple weather (generally dry and sunny more than 90% of days)

Seems like they're stuck in one of those local maximums Elon talked about a while back.
LOTS of places have those things. Heck, the other 2,950 square miles of greater Phoenix certainly have those things. Why didn't Waymo at least expand there? They ordered 82,000 vehicles. That's more than enough to cover Phoenix and Tucson.

I hate to keep repeating myself, but Waymo's problem is their business model. Forget the billions they've sunk into R&D and whatever -- their service itself loses money every day. They haven't found a path forward to it being able to make money. And they lack entrepreneurial leaders who iterate rapidly until they find that path forward. Assuming the path even exists -- despite the hopes of autonomy fans it's possible Robotaxis are simply a small niche market.
 
I hate to keep repeating myself, but Waymo's problem is their business model. Forget the billions they've sunk into R&D and whatever -- their service itself loses money every day.


If only they hadn't wasted all that money on LIDAR and HD maps :p

despite the hopes of autonomy fans it's possible Robotaxis are simply a small niche market.

With reliable vision-only RTs in low-TCO EVs, it'd be almost impossible NOT to make money- and in significantly larger amounts than currently profitable taxi offerings, in a lot more areas.

The only question is if Tesla will be able to deliver that product into existence.
 
If only they hadn't wasted all that money on LIDAR and HD maps :p



With reliable vision-only RTs in low-TCO EVs, it'd be almost impossible NOT to make money- and in significantly larger amounts than currently profitable taxi offerings, in a lot more areas.

The only question is if Tesla will be able to deliver that product into existence.
I really don't care about whether or not my two Teslas will be RTs. I do care about getting an analogous product to NoA on city streets and more complex highways. Driver assistance at L2 or better.

Let's get real - HW3 will be very unlikely to become capable of being a RT. (Remember, there's a steering wheel there!)
 
I really don't care about whether or not my two Teslas will be RTs. I do care about getting an analogous product to NoA on city streets and more complex highways. Driver assistance at L2 or better.

Well, L2 is what the FSD beta is doing. It's... not quite done yet :)


Let's get real - HW3 will be very unlikely to become capable of being a RT. (Remember, there's a steering wheel there!)


HW3 for sure won't- not enough compute for redundancy from node A to node B.

HW4 possibly could- but nobody (Tesla included) will know that until they do it.
 