Lidar lost Tesla the robotaxi?

Outcome of Elon's insistence on not using lidar?

  • Tesla will win over the robotaxi competition by 2024 due to lower production cost
    Votes: 13 (14.6%)
  • Tesla will win over the robotaxi competition after a few years due to lower production cost
    Votes: 15 (16.9%)
  • Competition already on the market with lidar will maintain the lead
    Votes: 15 (16.9%)
  • By the time Tesla launches a vision-only robotaxi, the competition will have done the same
    Votes: 15 (16.9%)
  • Elon is forced to change his mind and will include lidar in a future Tesla model
    Votes: 31 (34.8%)
  • Total voters: 89
Elon has mentioned that his decision to focus on vision (cameras) stems from the heavy processing load and the competing data generated by running vision and radar at the same time. With both sensors operating, the computer must take in two data streams, compare them, and sort out which of the two scenarios to follow. That can be a very difficult call for the computer to make.
The competition handles this just fine. With the support of multiple inputs, they are leaps and bounds ahead of Tesla in autonomy.

Thus it would be wiser to believe what different companies have actually shipped rather than what Elon and others are tweeting.
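To make the fusion-conflict point concrete, here's a toy sketch of the vision-vs-radar disagreement problem; every structure, weight, and threshold in it is invented for illustration and has nothing to do with Tesla's actual stack:

```python
from dataclasses import dataclass

@dataclass
class Track:
    """One sensor's estimate of an object ahead (hypothetical structure)."""
    distance_m: float   # range to the object
    speed_mps: float    # object speed along our lane
    confidence: float   # sensor's self-reported confidence, 0..1

def fuse(vision: Track, radar: Track, max_gap_m: float = 5.0) -> Track:
    """Confidence-weighted fusion of two tracks.

    The hard part the post alludes to: when the sensors disagree badly,
    averaging is wrong, because one of them is hallucinating, and the
    code has to pick a side. The thresholds here are invented.
    """
    if abs(vision.distance_m - radar.distance_m) > max_gap_m:
        # Streams contradict each other: trust the more confident one.
        # (Real stacks would use track history, not a single snapshot.)
        return vision if vision.confidence >= radar.confidence else radar
    # Streams roughly agree: blend them, weighted by confidence.
    w = vision.confidence / (vision.confidence + radar.confidence)
    return Track(
        distance_m=w * vision.distance_m + (1 - w) * radar.distance_m,
        speed_mps=w * vision.speed_mps + (1 - w) * radar.speed_mps,
        confidence=max(vision.confidence, radar.confidence),
    )

# A radar ghost (say, a manhole cover) vs. vision seeing open road:
print(fuse(Track(80.0, 25.0, 0.9), Track(12.0, 0.0, 0.6)))
```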
 
  • Like
Reactions: 2101Guy
I keep coming back to the fact that people have been driving cars for the past 100 years using only their two eyes (and a sophisticated neural net that is our brain).
It's only a matter of time until the Tesla neural net becomes sophisticated enough to manage driving. It is not necessary for the Tesla net to be as capable as a human brain, just enough to drive. When you look at nature, we have small insects that manage to navigate complex 3D environments with very small neural nets. I think that the Tesla net could become as good as that of a fly.

Processing power: It could be that the neuron count of a fly is sufficient to get a driver's license and drive safely. Maybe we'll learn that one day; today there is no evidence of it.

Neural network structure: Tesla and others use very different types of NNs compared to flies and humans. Maybe someday someone will figure out how to build artificial networks comparable to what flies use. This has been a research topic for decades, and today we are not even close. That said, it is likely that the structure does not need to emulate an organic brain, that general intelligence is not needed, and that the current deep-learning approach could one day prove sufficient.

Thus the question is: (A) will Tesla R&D come up with a neural network structure capable of driving autonomously, and (B) does AP/HW3 have enough processing power to run that network? Someday Tesla may be able to give a positive answer to A. If they do, they might later be able to give a positive answer to B. Both are needed for current-generation cars to become autonomous.
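Question (B) at least reduces to back-of-envelope arithmetic. In the sketch below, only the ~72 TOPS-per-chip figure comes from Tesla's own Autonomy Day claims for HW3; the network size, frame rate, and utilization are placeholder assumptions:

```python
# Rough feasibility check for question (B). All network numbers are
# invented placeholders; only ~72 TOPS/chip is Tesla's claimed peak.

CHIP_TOPS = 72e12      # ops/s claimed for one HW3 chip (peak)
UTILIZATION = 0.3      # realistic fraction of peak (assumption)

cameras = 8            # HW3 camera count
fps = 36               # frames per second per camera (assumption)
macs_per_frame = 50e9  # multiply-accumulates per forward pass (assumption)

required = cameras * fps * macs_per_frame * 2   # one MAC = 2 ops
available = CHIP_TOPS * UTILIZATION

print(f"required: {required / 1e12:.1f} TOPS, "
      f"available: {available / 1e12:.1f} TOPS")
print("fits" if required <= available else "does not fit")
```

With these made-up numbers it doesn't fit; with slightly different made-up numbers it does, which is exactly why (B) stays open until (A) is answered.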
 
I think a closer analogy would be a dragonfly. There have been studies on using dragonflies as aids to augment robot vision, based on how they predict prey movements.

 
There are all kinds of problems with snow. Lidar and radar can't read snow-covered road signs either. Bad weather is a use case to be handled after fair-weather FSD is solved.
When one sense or sensor fails, a smart being relies on what it still has.

As a human driving in a snowstorm that has covered the street and the signs, I use a memorized "map": recalling what the signs were, looking at the environment, and figuring out where the road is.

A car should do the same: use every input it can gather (radar, vision, map, sound, GPS, ...) and draw intelligent conclusions from all of them.
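A minimal sketch of that "use what you still have" idea; the source names, priority order, and threshold are all hypothetical, not anything from a shipping system:

```python
def localize(sensors: dict) -> tuple[str, float]:
    """Pick a lane-position estimate from whatever inputs survived.

    `sensors` maps a source name to (estimate, confidence); sources
    whose input failed are simply absent. Names are hypothetical.
    """
    # Prefer direct perception, fall back to memory, then map + GPS.
    for source in ("vision_lanes", "radar_barriers",
                   "remembered_map", "gps_plus_map"):
        if source in sensors:
            estimate, confidence = sensors[source]
            if confidence > 0.5:  # invented threshold
                return source, estimate
    raise RuntimeError("no trustworthy input: slow down and pull over")

# Snowstorm: lane markings invisible to vision, but the memorized map
# and GPS still give a usable lane-center offset (in metres).
source, offset = localize({
    "remembered_map": (0.2, 0.8),
    "gps_plus_map": (0.35, 0.6),
})
print(source, offset)  # -> remembered_map 0.2
```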
 
  • Like
Reactions: OxBrew
I think we're a very long way from AI handling this:

[attached image: 1661378469430.png]


The best we can do now is treat it like an unmarked residential road and stay more toward the right. But the car can't tell where the curb or the embankment is. A good friend of mine was driving in Big Bear (a popular ski resort in Southern California) after a snowstorm. It was a bright, clear day, but he couldn't tell exactly where the edge was, and he ended up with the right wheels dropping as he got too close to the edge, where the snow was deeper. He stopped safely, but had to have a friendly local with a truck pull him out.

Radar, lidar, and vision wouldn't see where the street ended and a deep snow area began. The only thing I can think of would be hyper-accurate map data with hyper-accurate GPS. The car would have to be able to stay centered in its lane on GPS alone.
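"Stay centered on GPS alone" boils down to computing cross-track error against a mapped centerline, and the arithmetic shows why the accuracy bar is so high: a 1 m GPS error is already more than a quarter of a 3.6 m lane. A purely illustrative sketch:

```python
import math

def cross_track_error(car, a, b):
    """Signed distance (m) from the car to the centerline segment a->b.

    Points are (x, y) in a local metric frame (e.g. metres east/north).
    Positive means the car is left of the line. Illustrative only.
    """
    ax, ay = b[0] - a[0], b[1] - a[1]      # segment direction
    px, py = car[0] - a[0], car[1] - a[1]  # car relative to segment start
    return (ax * py - ay * px) / math.hypot(ax, ay)  # 2D cross / length

# Car 0.5 m left of a straight mapped centerline:
print(cross_track_error((1.0, 0.5), (0.0, 0.0), (10.0, 0.0)))  # -> 0.5
```

The steering math is trivial; the problem is that both the map and the GPS fix have to be good to a few centimetres for the output to mean anything, and consumer GPS is not.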
 
At the extreme: OK, Tesla perfects FSD based on vision, let's say in two weeks; it's perfect* and everyone loves it.

*In good weather.

They're working inside their own self-defined "geofence": good weather. Then what? Whoops, it's raining hard, take over, bub. I already pull over when it's unsafe to drive in bad weather. How is this helping? OK, it's better than 90% (or whatever) in good weather. What percentage of all accidents does that solve? Not 100, that's for sure.

I think that's a deal-killer. And they don't get ANY knock-on functionality they could use in the future to enhance regular driving: the ability to see through fog, rain, snow, or darkness.

Just seems like a huge blind spot, and the result will be a gimmick for daytime good weather use. Cool story, bro (Elon).

[long boring parts of trips are already really well solved by free AP, so that's not part of the FSD benefit]

Since this thread is about Tesla's lack of lidar, your bad-weather argument is moot, because the laser in lidar can't shine through inclement weather either. OK, so now let's bring in radar, which CAN see through inclement weather. If the weather is so bad that vision is useless, can the car continue to drive on radar alone? Not even close. Radar resolution is poor, and like lidar, it can't read signs, stop-light status, or lane lines.

So in these hypothetical bad-weather situations, the only solution for humans and machines alike is to drive way slower (or pull over, like you do), so there is more time to react to something unexpected.
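That rule is just stopping-distance arithmetic: reaction distance plus braking distance has to fit inside whatever you can actually see. A sketch with assumed slick-road numbers:

```python
import math

def max_safe_speed(visibility_m, reaction_s=1.5, decel_mps2=3.5):
    """Largest v (m/s) such that v*t_react + v^2/(2a) <= visibility.

    Solves v^2/(2a) + v*t - d = 0 for v. The 1.5 s reaction time and
    the low 3.5 m/s^2 deceleration (slick road) are assumed values.
    """
    a, t, d = decel_mps2, reaction_s, visibility_m
    return a * (-t + math.sqrt(t * t + 2.0 * d / a))

for vis in (200, 100, 50, 20):  # metres of usable visibility
    print(f"{vis:>4} m visibility -> {max_safe_speed(vis) * 3.6:5.1f} km/h")
```

Whether the thing doing the seeing is an eyeball, a camera, or a radar only changes the visibility number you feed in.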
 
I keep coming back to the fact that people have been driving cars for the past 100 years using only their two eyes (and a sophisticated neural net that is our brain).
It's only a matter of time until the Tesla neural net becomes sophisticated enough to manage driving. It is not necessary for the Tesla net to be as capable as a human brain, just enough to drive. When you look at nature, we have small insects that manage to navigate complex 3D environments with very small neural nets. I think that the Tesla net could become as good as that of a fly.
And yet, people keep regularly smashing into each other in hundred-car pileups with dozens of fatalities when encountering fog or smoke. That is why I think radar should always be included. It doesn't need to be high resolution to avoid a bunch of stopped or slow cars.
 
  • Like
Reactions: OxBrew
I think we're a very long way from AI handling this:

[attached image: 1661378469430.png]
We're a long way from humans handling this. We do a very poor job of it. On the first snow of the season, thousands of otherwise good snow drivers bump into each other like bumper cars, forgetting how little grip they have. I almost always slide through the stop sign at the end of the block on day one, even though I remember to think about it.

Sorry to conflate lidar with radar or any other non-visual sensor. But I fundamentally disagree that we should solve clear-daylight FSD before solving for bad weather and edge cases.

Do the hard(er) parts first; then the easier parts will be pretty much done already. In reality it should be all at once, but in the end it's almost malpractice to sell something that does the easy part, makes us complacent, and lets our guard down, leaving us less ready to deal with the edge cases ourselves in those rare instances when we need to. Mode confusion is already a problem; this is like meta-level mode confusion, confusing the ______ out of people about what the capabilities really are.

But whatever, it's still really cool to see the improvement, hope it's way better than I expect, way sooner.
 
Regardless of the technical merits, I'd much rather NOT have lidar shining into my eyes from oncoming traffic during my commute.
Some will explain away the dangers, but they cannot account for the cumulative, time-weighted damage.
 
  • Like
Reactions: mspohr
I think they have worked really hard to avoid doing any of this. Maybe I'm wrong, but once you get into manual pathing, that data has a shelf life, as things slowly change over time. I expect they have been taking the "high road" on this and trying to keep the pathing a purely algorithmic programming exercise. I'd love to have one of the FSD engineers comment on that at some point.

It's just not feasible to have enough algorithmic programming to reason sufficiently well without artificial general intelligence. Even humans intuitively follow what other people are doing, particularly at a novel intersection. Humans also typically drive similar routes and become familiar with unusual intersections, knowing what to do from experience. Why do we frequently see tourists drive in the wrong lanes, make sudden movements, or stop where they shouldn't? Because even humans can be confused and fail to make the right choice immediately in a new area. How should a special-purpose AI react? A crowdsourced semantic map would do even better than a human tourist here.

Yes, the data will have a finite lifetime. But it's not that hard to detect drift, or even sudden changes, in timestamped datasets, or to build models in which newer data carry more weight than older data.
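One standard way to make newer data count for more, sketched with an arbitrarily chosen half-life:

```python
from datetime import datetime, timezone

HALF_LIFE_DAYS = 90.0  # arbitrary: a 90-day-old report counts half as much

def recency_weight(observed_at: datetime, now: datetime) -> float:
    """Exponential-decay weight for one crowdsourced map observation."""
    age_days = (now - observed_at).total_seconds() / 86400.0
    return 0.5 ** (age_days / HALF_LIFE_DAYS)

def fused_value(observations, now):
    """Weighted mean of (value, observed_at) pairs, newest counting most."""
    weights = [recency_weight(ts, now) for _, ts in observations]
    return sum(v * w for (v, _), w in zip(observations, weights)) / sum(weights)

now = datetime(2022, 8, 25, tzinfo=timezone.utc)
observations = [
    (30.0, datetime(2022, 2, 1, tzinfo=timezone.utc)),  # stale speed limit
    (40.0, datetime(2022, 8, 1, tzinfo=timezone.utc)),  # recent re-survey
]
print(fused_value(observations, now))  # ~38: skews strongly toward 40
```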

It would mean continual two-way communication back to a server infrastructure for routing, and it's possible that part has not yet been implemented. It's definitely not easy, as it would require updates between the usual software version updates.
 
Since this thread is about Tesla's lack of lidar, your bad-weather argument is moot, because the laser in lidar can't shine through inclement weather either. OK, so now let's bring in radar, which CAN see through inclement weather. If the weather is so bad that vision is useless, can the car continue to drive on radar alone? Not even close. Radar resolution is poor, and like lidar, it can't read signs, stop-light status, or lane lines.

Radar resolution on some modern higher-frequency chipsets is much better than before.
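For scale: a conventional radar's angular resolution is roughly wavelength over antenna aperture, and the newer imaging-radar chipsets improve on that mainly through large virtual (MIMO) arrays. The apertures below are assumed for illustration, not taken from any datasheet:

```python
import math

C = 3e8  # speed of light, m/s

def angular_resolution_deg(freq_hz: float, aperture_m: float) -> float:
    """Rayleigh-style estimate: resolution ~ wavelength / aperture."""
    wavelength_m = C / freq_hz
    return math.degrees(wavelength_m / aperture_m)

# Classic 77 GHz automotive radar, ~10 cm physical aperture:
print(angular_resolution_deg(77e9, 0.10))  # ~2.2 degrees
# Imaging radar with a large virtual array (~25 cm effective, assumed):
print(angular_resolution_deg(77e9, 0.25))  # ~0.9 degrees
```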
I think they should be using semantic maps gathered from ground-truth mapping companies and crowdsourced from actual human driving, presumably collected in good weather. Conceivably such a system could be better than humans if it knew where the lanes used to be under the snow.

Radar ADAS might be able to drive slowly if visibility is not completely zero. But humans would be reluctant to drive then too.
 
  • Like
Reactions: nvx1977
And yet, people keep regularly smashing into each other in hundred-car pileups with dozens of fatalities when encountering fog or smoke. That is why I think radar should always be included. It doesn't need to be high resolution to avoid a bunch of stopped or slow cars.
Yes, it does, because it has to distinguish stopped cars from low overhead bridges, and low overhead bridges from low overhead bridges *and* stopped cars, and it has to cope with elevation changes up, down, and both in sequence.

Your eyes and brain gimbal to track the actual horizon and do a great deal of unconscious processing and compensation.

Radar and cameras, by contrast, sit at fixed orientations on a jiggling car.
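The bridge-vs-stopped-car confusion is really an elevation-resolution problem: at detection range the two returns sit only a degree or two apart vertically. Rough geometry with illustrative numbers:

```python
import math

def elevation_deg(height_m: float, range_m: float) -> float:
    """Elevation angle of a return at a given height and distance."""
    return math.degrees(math.atan2(height_m, range_m))

range_m = 150.0                        # assumed detection range
bridge = elevation_deg(5.0, range_m)   # bridge deck roughly 5 m up
car = elevation_deg(0.6, range_m)      # return off a car's rear, ~0.6 m up
print(f"bridge: {bridge:.2f} deg, car: {car:.2f} deg, "
      f"gap: {bridge - car:.2f} deg")  # gap ~1.7 degrees
```

A radar with no (or coarse) elevation channel sees those two returns as the same thing, which is why older systems either braked for bridges or were tuned to ignore stationary returns entirely.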
 
  • Like
Reactions: mspohr