Welcome to Tesla Motors Club

LiDAR - NASA confirms cameras better than LiDAR - Musk is correct

Tesla has a front radar for TACC, but the main sensors Tesla uses for autonomous driving are the 8 cameras. Tesla's entire autonomous driving approach is based on training computer vision to understand the world around the car. This contrasts with the "lidar approach," shorthand for the other autonomous driving approach, which also uses cameras and radar but adds lidar as a primary sensor for perception.

Excellent short synthesis.
I don't think any user is concerned about which method is used to get the results, as long as the system delivers them.

But unfortunately, the very first step of automation, Tesla's approach to TACC as quoted above, is in fact vastly inferior to the common, cheap systems fitted to other cars, owing to the all-too-frequent phantom braking problem, which remains largely unsolved, as everyone on TMC knows, though too many have decided to ignore what that implies.

And since the adaptive behavior cannot be switched off, as it can in most cars with adaptive CC, if you don't want the stress of waiting for a phantom brake with your foot hovering over the accelerator, you end up with a car lacking even a plain CC.

Is it unreasonable to wait until the Tesla system described by @diplomat33 works safely on par with a $1200 plain Level 2 suite (adaptive CC and simple lane keeping) available in other cars, before comparing the different approaches? How can we talk about Autopilot if you cannot rely on safe straight-line automation?

Because if this is not possible, whether due to low camera resolution or to the excessive computing power (and resulting drain on the battery) required, as some in the field claim, then perhaps a different layout and different components will be needed, and theoretical ambitions will be killed by the reality of those limitations.
 
  • Like
Reactions: cucubits
But unfortunately, the very first step of automation, Tesla's approach to TACC as quoted above, is in fact vastly inferior to the common, cheap systems fitted to other cars, owing to the all-too-frequent phantom braking problem, which remains largely unsolved

Yeah, they really went backwards on that one. On the old AP1 cars, phantom braking occurrences are extremely rare. They must have messed something up in their four-year-old "new" software since they moved to AP2, 2.5, and 3 hardware.
 
  • Disagree
Reactions: PhilDavid
On the old AP1 cars, phantom braking occurrences are extremely rare. They must have messed something up in their four-year-old "new" software

I think the reason AP1 has less phantom braking is that it doesn't handle as many situations. For example, it has no cut-in detection, doesn't take sharp corners, has less advanced emergency-braking detection, etc.
 
I think the reason AP1 has less phantom braking is that it doesn't handle as many situations. For example, it has no cut-in detection, doesn't take sharp corners, has less advanced emergency-braking detection, etc.

Yes, most likely that's the reason, but then how are they going to fix this? Just keep fine-tuning what's ignored and what isn't, so that fewer and fewer highway overpasses are mistaken for something else and make the car brake? That's not a solution; one will never really be able to trust the car.
 
  • Disagree
Reactions: PhilDavid
Yes, most likely that's the reason, but then how are they going to fix this? Just keep fine-tuning what's ignored and what isn't, so that fewer and fewer highway overpasses are mistaken for something else and make the car brake? That's not a solution; one will never really be able to trust the car.
You don't fix it by ignoring anything.
You fix it by training the neural network to handle the problem.
It will help considerably to be able to make predictions from a fully 3D view of the environment - which the rewrite should bring.
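A toy sketch of why full 3D matters here (everything below, the `Detection` type and the clearance figure, is invented for illustration and is not Tesla's actual representation): in a flat world model, an overpass ahead of a dipping road reads as an object in the lane, while with real 3D positions it can be ruled out by its height above the road surface.

```python
# Illustrative only: with 3D positions, an overpass can be ruled out
# as an obstacle by its clearance above the road surface, something a
# flat ("2.5D") world model cannot express.

from dataclasses import dataclass

@dataclass
class Detection:
    distance_m: float       # longitudinal distance ahead of the car
    bottom_height_m: float  # height of the object's lowest point above the road

MAX_RELEVANT_HEIGHT_M = 4.5  # assumed clearance; nothing above this can be hit

def is_braking_relevant(d: Detection) -> bool:
    """Only objects that intrude into the space the car will drive
    through should be passed on to the braking planner."""
    return d.bottom_height_m < MAX_RELEVANT_HEIGHT_M

overpass = Detection(distance_m=120.0, bottom_height_m=5.2)
stopped_truck = Detection(distance_m=120.0, bottom_height_m=0.0)

print(is_braking_relevant(overpass))       # False: clears the car
print(is_braking_relevant(stopped_truck))  # True: genuine obstacle
```

In a flat model both detections would look identical to the planner, which is one plausible reading of why overpasses trigger phantom braking.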
 
I believe phantom braking is solved by reducing uncertainty and giving the planner a more accurate simulation of what’s going on around the car.

In my 15 or so months of experience with the FSD option, phantom braking has evolved, sometimes gotten worse, but over time it has happened for different reasons. The typical progression for me was:

1) damn, shadows on the highway
2) overpasses where the road dips down (I thought I heard Elon state that their current “2.5D” approach assumes the world is flat)
3) suddenly, last summer, it started slowing down for cars it sees about to merge onto the highway
4) when I’m on a bridge in a construction zone with concrete barriers on each side, it doesn’t want to pass cars in the adjacent lanes
5) now it’s stop-light recognition false positives

My point here is that, at least in my experience, Tesla has improved on or solved each of these progressively, and when phantom braking takes me by surprise it’s usually because it’s linked to a new capability of the Autopilot system.
As the uncertainty of that feature decreases over time, I notice the phantom braking disappearing.

We should continue to expect phantom braking until, through time and training, they’ve built a system that’s confident handling just about anything.
The alternative is to build a dumb system with lots of confidence.
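One way to read "reducing uncertainty" as a concrete engineering choice (a hypothetical sketch; `BrakeGate`, its threshold, and the frame count are made up for illustration, not anything from Tesla's stack): require an obstacle detection to persist with high confidence over several frames before the planner is allowed to brake on it.

```python
# Hypothetical sketch: suppress phantom braking by requiring a detection
# to persist with high confidence across several consecutive frames,
# instead of reacting to a single uncertain frame.

from collections import deque

class BrakeGate:
    def __init__(self, threshold: float = 0.8, frames_required: int = 5):
        self.threshold = threshold
        self.history = deque(maxlen=frames_required)

    def update(self, confidence: float) -> bool:
        """Feed one frame's obstacle confidence; return True only when
        every recent frame agrees the obstacle is real."""
        self.history.append(confidence)
        return (len(self.history) == self.history.maxlen
                and all(c >= self.threshold for c in self.history))

gate = BrakeGate()
# A one-frame flicker (e.g. a shadow misread as an obstacle) never fires:
flicker = [0.1, 0.95, 0.1, 0.1, 0.1, 0.1]
print(any(gate.update(c) for c in flicker))  # False

gate = BrakeGate()
# A persistently confident obstacle does:
real = [0.9, 0.92, 0.95, 0.91, 0.93]
print(any(gate.update(c) for c in real))  # True
```

The trade-off in such a scheme is a few frames of added reaction latency in exchange for far fewer spurious brake events.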

HW3 is like a child suddenly realizing that there’s this big thing around them called the universe while AP1 would more resemble a drunk monkey.

I’m crossing my fingers that 4D autopilot solves the phantom braking with valleys and peaks in the road.
 
  • Love
Reactions: Brando
I think your title is a bit misleading and clickbait. The scientist never said that lidar failed or that cameras are inherently better. He simply said that in this particular instance lidar did not provide the resolution they needed and that cameras were a more ideal solution to this particular problem.
That literally means lidar failed and cameras are better. You tried to sneak in the word "inherently" to build a strawman but failed as well.
 
That literally means lidar failed and cameras are better. You tried to sneak in the word "inherently" to build a strawman but failed as well.

No. The point is that cameras being better in one situation (space docking) does not prove that they are better in a completely different situation (FSD). And the title says "Musk is correct" implying that the scientist has proved that vision-only for autonomous driving is correct. But the article does not prove that at all. The fact is that all AVs use cameras. They use both cameras and lidar for different reasons. It was never a question of cameras being better than lidar or lidar being better than cameras.
 
Again, that's another strawman. Why would NASA comment on other applications?

I am NOT talking about NASA. I am talking about the OP. The OP title is "lidar - NASA confirms cameras better than lidar - Musk is correct". The OP made the claim that NASA proved that vision-only for FSD is correct. The OP is making a false claim since NASA did not say that.
 
  • Like
Reactions: qdeathstar
Lead Scientist Behind NASA's Asteroid Mission Talks About The Biggest Problems They Solved

From the Scott Manley YouTube channel:

Very interesting interview, IMHO.

Especially note how LiDAR failed and NASA moved to a camera-only solution. Musk's opinion validated? I think so.
Once level 5 is a solved problem, we can make bold statements. Until then…. Not so much.
 
  • Like
  • Disagree
Reactions: !igs and diplomat33
Once level 5 is a solved problem, we can make bold statements. Until then…. Not so much.
Let's look at the past and present:

Even with LIDAR, no one could reach the finish line of the first DARPA Grand Challenge in 2004.

By the 2007 DARPA Urban Challenge (YouTube), there were about 36 participants, but only 6 finished the race, and all of them had LIDAR.


1) Carnegie Mellon University, Tartan


2) Stanford, Junior


3) Virginia Tech, VictorTango


4) MIT, Talos:


5) University of Pennsylvania, Little Ben


6) Team Cornell’s Skynet



Waymo was able to use LIDAR to give a blind man a ride in its Prius with no human driver as far back as 2012:


Waymo has been letting its driverless cars drive the public (not just NDA riders) around since 2020.

Companies that don't have LIDAR can't do that today.

Even with the advances of Tesla Vision, it still collides with stationary objects; AI Addict being expelled from the FSD beta is one example.


In my opinion: as collision-avoidance technology, LIDAR has been proven since the 2007 DARPA Urban Challenge. Meanwhile, Tesla Vision has proven unreliable for collision avoidance, past and present, because it still requires a human driver to take the blame.

What the LIDAR-equipped systems need now is intelligence.
 
I am NOT talking about NASA. I am talking about the OP. The OP title is "lidar - NASA confirms cameras better than lidar - Musk is correct". The OP made the claim that NASA proved that vision-only for FSD is correct. The OP is making a false claim since NASA did not say that.
Yes, you are. You are talking about NASA, not the OP. You wrote:

"The scientist never said that lidar failed or that cameras are inherently better. He simply said that in this particular instance lidar did not provide the resolution they needed and that cameras were a more ideal solution to this particular problem."

You are conflating the scientist with the OP and trying to mislead everyone.
 
  • Disagree
Reactions: qdeathstar
Yes, you are. You are talking about NASA, not the OP. You wrote:

"The scientist never said that lidar failed or that cameras are inherently better. He simply said that in this particular instance lidar did not provide the resolution they needed and that cameras were a more ideal solution to this particular problem."

You are conflating the scientist with the OP and trying to mislead everyone.

No, I am not misleading anyone.

What I said about the scientist is true. The NASA scientist said that cameras were better than lidar for space docking. He never said that vision-only was better for FSD.

The OP is using the scientist's words about cameras being better for space docking as "proof" that Elon is right about vision-only FSD. The OP is making a connection where there is none.

The scientist is not wrong. I am simply pointing out that the OP is wrong to try to use the scientist's words as "proof" that Elon is right about vision-only for FSD. The OP is twisting what the NASA scientist said, not me.
 

If lidar were pointless as Elon suggests, then we would expect companies to get rid of lidar and only use radar. Yet, every major AV company is still using lidar very successfully, with radar, and several automakers are adding lidar to future consumer cars. So lidar does not appear to be pointless at all. They clearly see an advantage in lidar that Elon is missing.
 
If lidar were pointless as Elon suggests, then we would expect companies to get rid of lidar and only use radar. Yet, every major AV company is still using lidar very successfully, with radar, and several automakers are adding lidar to future consumer cars. So lidar does not appear to be pointless at all. They clearly see an advantage in lidar that Elon is missing.
Of course. I just thought it was funny that SpaceX uses LIDAR for docking, which by the OP's logic would "prove" that LIDAR is superior to cameras. It's application dependent!
I think Elon may be hoping that synthetic aperture RADAR can get close to the resolution of LIDAR. I have no idea what's possible, though.
 
  • Like
Reactions: diplomat33