Autonomous Car Progress

A few tidbits from Amnon Shashua's CES 2021 presentation:

Level 4 consumer car 2025. Includes cameras, imaging radar and Intel’s proprietary Lidar.

Before that, robotaxis with Luminar lidar.

HD maps are collected completely automatically. Data provided by 1 million cars, 6 OEMs. Global coverage.

Shashua believes that it is not possible to achieve adequate safety using vision only.
 
A few tidbits from Amnon Shashua's CES 2021 presentation:

Level 4 consumer car 2025. Includes cameras, imaging radar and Intel’s proprietary Lidar.

Before that, robotaxis with Luminar lidar.

HD maps are collected completely automatically. Data provided by 1 million cars, 6 OEMs. Global coverage.

I am impressed by Mobileye. I think they have all the right ingredients to do really well in delivering autonomous cars to the public.
 
Wow, Mobileye is so ahead, they must have mapped our entire solar system.

This is just typical of companies that have nothing to deliver.
They will throw out stats like "1 billion kilometers globally, with more than 8 million kilometers mapped daily."
Or in Waymo's case, x number of autonomous miles, or disengagement rate of x.

Look at the shiny numbers, and lidar, oh and we can't do (by year 2025) what those other guys are already doing (in 2021), sooooo we're going to say that what those guys are doing is not safe!
Think of the children!!
 
Shashua believes that it is not possible to achieve adequate safety using vision only.

There are a lot of ways to skin a cat, and it is good for technology advancement that different companies use different strategies. One might say this comment could be interpreted as self-serving, but if lidar can be made affordable by technological advances, then maybe it will become a necessary sensor for AVs.

Isn't the issue that Elon has with Lidar that it is just too expensive to use in consumer vehicles?
 
Isn't the issue that Elon has with Lidar that it is just too expensive to use in consumer vehicles?
No, Elon's point is that there are too many drawbacks to lidar.

2016 Elon https://twitter.com/elonmusk/status/753843823546028040
2020 Elon https://twitter.com/elonmusk/status/1329847917834870784
 
At 21:30, in response to the question about REM generating maps rather than training data, Amnon seems to suggest that Tesla's approach (recording corner-case events from shadow-mode triggers, saving all the camera data, and sending it back over WiFi) is a natural, "reasonable" approach, but a brute-force solution for a crappy "beta" system that will eventually hit a glass ceiling.

He then says Mobileye solved perception years ago with very little data, and that the actual bottleneck is understanding the semantics of the road: where the drivable paths are, which paths and traffic lights are relevant, where to stop for intersections, and what the priorities are for yielding or not. He suggests that figuring those out with high reliability in a single pass "online" would be almost unachievable no matter how good the perception gets, so that's why they went with multi-pass crowdsourced data to map out all the static semantic information about a scene at centimeter accuracy while uploading only 10KB per kilometer traveled across the fleet.

It seems like if Mobileye weren't limited by bandwidth / network costs, they wouldn't have needed to spend 5 years engineering a solution to send back data extremely selectively. Definitely an interesting engineering problem with various tradeoffs, so I wonder what someone could do with a fleet without those restrictions.
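
For a sense of scale, here is a rough back-of-envelope comparison in Python. The 10 KB/km and ~8 million km/day figures come from the talk; the clip size and trigger rate are purely hypothetical numbers I made up to illustrate the tradeoff, not anything Tesla has published:

```python
# Back-of-envelope upload volumes. The "10 KB per km" and "~8 million km of
# data per day" figures come from the talk above; the clip size and clip
# trigger rate are made-up numbers purely to illustrate the tradeoff.

REM_BYTES_PER_KM = 10 * 1024        # ~10 KB of REM map data per km driven
FLEET_KM_PER_DAY = 8_000_000        # ~8 million km of driving data per day

CLIP_BYTES = 100 * 1024 ** 2        # hypothetical 100 MB multi-camera clip
CLIPS_PER_KM = 1 / 1000             # hypothetical: one triggered clip per 1,000 km

rem_gb_per_day = REM_BYTES_PER_KM * FLEET_KM_PER_DAY / 1024 ** 3
clip_gb_per_day = CLIP_BYTES * CLIPS_PER_KM * FLEET_KM_PER_DAY / 1024 ** 3

print(f"REM-style uploads:  ~{rem_gb_per_day:,.0f} GB/day")
print(f"Clip-style uploads: ~{clip_gb_per_day:,.0f} GB/day")
```

Even with a very conservative trigger rate assumed for the clip approach, the difference is an order of magnitude per day, which is presumably why the bandwidth constraint shaped the design.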
 

Elon is incorrect about the visible spectrum. Lidar does not operate in the visible spectrum. Lidar operates either at 905 nm or 1550 nm. The visible spectrum is 380 nm to 750 nm. Lidar operates well outside the visible spectrum, in the infrared range.

Velodyne's Guide to Lidar Wavelengths | Velodyne Lidar
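
For what it's worth, the numbers are easy to check; a trivial sketch using the 380-750 nm visible band and the 905/1550 nm lidar wavelengths cited above:

```python
# The figures under discussion: 905 nm and 1550 nm lidar vs. the 380-750 nm
# visible band. Both lidar wavelengths land outside it, i.e., in the infrared.

VISIBLE_NM = (380, 750)

for wavelength_nm in (905, 1550):
    in_visible = VISIBLE_NM[0] <= wavelength_nm <= VISIBLE_NM[1]
    print(f"{wavelength_nm} nm -> {'visible' if in_visible else 'outside visible (infrared)'}")
```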
 
He then says Mobileye solved perception years ago with very little data, and that the actual bottleneck is understanding the semantics of the road: where the drivable paths are, which paths and traffic lights are relevant, where to stop for intersections, and what the priorities are for yielding or not. He suggests that figuring those out with high reliability in a single pass "online" would be almost unachievable no matter how good the perception gets, so that's why they went with multi-pass crowdsourced data to map out all the static semantic information about a scene at centimeter accuracy while uploading only 10KB per kilometer traveled across the fleet.

He's right.
 
He then says Mobileye solved perception years ago with very little data
rofl... these are all perception.
Just because you have pictures of roads doesn't mean you "solved perception"
where the drivable paths are, which paths and traffic lights are relevant, where to stop for intersections

They could claim to have "solved" perception when they could recreate the scene from the outputs of the NN's
Similar to Tesla's Birds Eye View.
It seems to me like Mobileye could not determine depth and distance with cameras very well; otherwise he would not be making idiotic statements like "where to stop for intersections". If you can see the intersection and the stop line, what is your problem with stopping in time?
 
Elon is incorrect about the visible spectrum. Lidar does not operate in the visible spectrum. Lidar operates either at 905 nm or 1550 nm. The visible spectrum is 380 nm to 750 nm. Lidar operates well outside the visible spectrum, in the infrared range.

Velodyne's Guide to Lidar Wavelengths | Velodyne Lidar

@mspisars What are you disagreeing with? 905 nm and 1550 nm come from the lidar manufacturer. And the visible spectrum is 380 nm to 750 nm. That's high school physics. And last time I checked, 905 nm and 1550 nm are outside the range of 380-750 nm.
 
rofl... these are all perception.
Just because you have pictures of roads doesn't mean you "solved perception"


They could claim to have "solved" perception when they could recreate the scene from the outputs of the NN's
Similar to Tesla's Birds Eye View.
It seems to me like Mobileye could not determine depth and distance with cameras very well; otherwise he would not be making idiotic statements like "where to stop for intersections". If you can see the intersection and the stop line, what is your problem with stopping in time?

Educate yourself PLZ. But alas, who am I kidding. You are a Tesla fan. Miseducation, myths, and fables are the way to go.


 
Somebody severely fumbled that summary. At around 27:00 of the video, he states that they recorded (as of half a year ago) 8 million kilometers of data daily, not that they mapped 8 million kilometers of road daily.


So they meant to say 1B km of data globally, not 1B km of roads, and 8M km of data daily, not 8M km of roads daily. That explains the discrepancy. @powertoold
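
That reading also passes a quick arithmetic check; a small sketch relating the two quoted figures (purely illustrative):

```python
# How the two quoted figures relate if both describe fleet data: 1 billion km
# at ~8 million km/day corresponds to only a few months of collection, which
# reads like cumulative driving data rather than unique mapped road length.

KM_PER_DAY = 8_000_000
TOTAL_KM = 1_000_000_000

days = TOTAL_KM / KM_PER_DAY
print(f"{TOTAL_KM:,} km at {KM_PER_DAY:,} km/day -> ~{days:.0f} days (~{days / 30:.1f} months)")
```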
 
@mspisars What are you disagreeing with? 905 nm and 1550 nm come from the lidar manufacturer. And the visible spectrum is 380 nm to 750 nm. That's high school physics. And last time I checked, 905 nm and 1550 nm are outside the range of 380-750 nm.

You could tell them 1+1 = 2 and they will say no... because their gawd Elon said otherwise. It's quite sad, having zero original thought.
 
@mspisars What are you disagreeing with? 905 nm and 1550 nm come from the lidar manufacturer. And the visible spectrum is 380 nm to 750 nm. That's high school physics. And last time I checked, 905 nm and 1550 nm are outside the range of 380-750 nm.
Disagreeing with you being pedantic... you're trying to imply that Elon does not know what spectrum lidar operates in just because he tweeted visible instead of laser spectrum!
And the fact that the second tweet clarifies his point: "occlusion penetrating wavelength"!

The first tweet has enough context too, with "radar... can see through rain, snow, fog and dust".

I swear, it is like you want to be dense on purpose! :rolleyes:


(Attached image: electromagnetic spectrum chart)
 
rofl... these are all perception.
Just because you have pictures of roads doesn't mean you "solved perception"

What do you mean? What's "these"? His wording is likely lax in general, but I take it that his use of "perception" isn't colloquial, and that it refers to a particular set of tasks distinct from the other components of autonomous driving: planning and control. The academic literature on the matter is so vast that I'm not surprised by their confidence in perception, which comprises far more than mere pictures of roads.

mspisars said:
just because he tweeted visible instead of light spectrum!

He tweeted "visible wavelength". You think he meant "(which is light spectrum)" instead? The entire light spectrum? At best he meant just "light". That's not just nonsensical, it's also pretty far-fetched for a typo. Unless @diplomat33 and his source are factually wrong (noting that the source is a LiDAR manufacturer), you're being petty, again.
 
rofl... these are all perception.
Seems like Amnon's usage of "perception" refers to dynamic things, e.g., road users and traffic light colors, so everything else that's static can be premapped, including road geometry paths, road boundary curbs, and road semantic relevancy. Of course Mobileye has dynamic versions of the "static" stuff to generate the REM data and to handle real-time driving if things change, but as he pointed out, their online solution is not as reliable as taking multiple passes at a scene. Whereas Tesla is betting on correctly perceiving the dynamic and static aspects on the first pass.
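
To illustrate why multiple passes at the same static scene buy reliability, here is a toy sketch (all numbers made up, nothing Mobileye-specific) of averaging noisy per-pass estimates of a fixed landmark position; the error shrinks roughly as 1/sqrt(n):

```python
# Toy illustration of why multiple passes help with static map features:
# averaging n independent, noisy per-pass observations of a fixed landmark
# shrinks the error roughly as 1/sqrt(n). All numbers are made up.

import numpy as np

rng = np.random.default_rng(0)
true_stop_line_m = 12.34      # hypothetical true stop-line position along the road
per_pass_noise_m = 0.5        # assumed per-pass localization noise (std dev, meters)

for n_passes in (1, 10, 100):
    observations = true_stop_line_m + rng.normal(0.0, per_pass_noise_m, size=n_passes)
    estimate = observations.mean()
    error_cm = abs(estimate - true_stop_line_m) * 100
    print(f"{n_passes:>3} passes -> estimate {estimate:6.3f} m (error {error_cm:5.1f} cm)")
```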
 
Seems like Amnon's usage of "perception" refers to dynamic things, e.g., road users and traffic light colors, so everything else that's static can be premapped, including road geometry paths, road boundary curbs, and road semantic relevancy. Of course Mobileye has dynamic versions of the "static" stuff to generate the REM data and to handle real-time driving if things change, but as he pointed out, their online solution is not as reliable as taking multiple passes at a scene. Whereas Tesla is betting on correctly perceiving the dynamic and static aspects on the first pass.
Good perception is seeing the reality and being able to recreate it accurately virtually on the fly...
Maps provide some context, but are not the source of truth.

Hence my ROFL.
 
Seems like Amnon's usage of "perception" refers to dynamic things, e.g., road users and traffic light colors, so everything else that's static can be premapped, including road geometry paths, road boundary curbs, and road semantic relevancy. Of course Mobileye has dynamic versions of the "static" stuff to generate the REM data and to handle real-time driving if things change, but as he pointed out, their online solution is not as reliable as taking multiple passes at a scene. Whereas Tesla is betting on correctly perceiving the dynamic and static aspects on the first pass.

When he says they've "solved perception", as much of an abuse of language as that may be, I take it they believe they've achieved substantial accuracy in "correctly perceiving the dynamic and static aspects on the first pass", in the exact same way that Tesla would. At around 24:00, my interpretation is that Amnon nonetheless believes the accuracy achievable on a single pass is insufficient for the performance Mobileye seeks, hence their reliance on multiple passes. Note that he specifically says that this map-building is data-driven, meaning that each single pass must collect data automatically, and thus that the online solution is indeed functional (and indeed not as reliable as taking multiple passes at a scene).

I doubt that either Mobileye or Tesla is using anything less than state-of-the-art techniques for what you call the online solution, so, as I interpret it, the difference isn't that Mobileye (necessarily) applies a poorer online solution in favor of map-building with multiple passes, but rather that Mobileye builds a map in addition to online perception (which, again, because he claims they've "solved perception", I'm inclined to believe is in a good state, at least).