Welcome to Tesla Motors Club

Karpathy talk today at CVPR 2021

Re: Best radar --- That seems confusing to me. I pointed out that they have used two different radar components from well-known companies. I get the feeling you're implying 'cheap,' outdated radar.

Yeah, the radar is outdated. There are newer 4D radars with greater resolution.

Re: how Tesla was doing radar --- That is puzzling too. Given all they've done with their software and the caliber of tech folks they have, it's very hard to believe they wouldn't have figured it out if it were reasonable to do. I'm not buying that they just couldn't figure it out well enough. I think other companies may be doing it with more specific environments and more limited use cases ... Tesla's goal is that you can pick the car up and let 'FSD' work 'wherever' you drop it. In other words, a more generic approach (plus at higher speeds).

I am sure Tesla could have figured it out if they really wanted to. But Elon is biased towards vision-only. So when faced with issues with radar, Elon was going to choose the option of ditching radar rather than the option to fix radar.

By the way, Cruise uses long range, high resolution radar on the A-pillar that can articulate to face different angles. The idea is that instead of using more sensors to cover every angle, they can use fewer sensors that can pivot to cover more angles or cover the angle that the car needs to focus on the most.


I wholly admit I wish they would keep radar for weather reasons. I've driven through some harsh rains, even in the past two weeks in the 'mountains' of VA, that made me glad to have radar seeing farther ahead than my eyes (and thus the Tesla cameras) could see, even with the wipers going max speed.

Yeah, radar is great for detecting moving objects and calculating velocities in conditions like fog, rain and snow. Without radar, Tesla may be limited in those conditions.
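The velocity point is worth unpacking: radar measures radial velocity directly from the Doppler shift of the return, independent of visibility. A minimal sketch of that relation, using an illustrative 77 GHz automotive carrier frequency (the numbers here are assumptions, not Tesla's actual parameters):

```python
# Hedged sketch: radial velocity from the Doppler shift of a radar return.
# Carrier frequency and the example shift are illustrative assumptions.

C = 3.0e8          # speed of light, m/s
F_CARRIER = 77e9   # common automotive radar band, Hz (assumed)

def radial_velocity(doppler_shift_hz: float) -> float:
    """Radial (closing) velocity of a target from its Doppler shift.

    v = (delta_f * c) / (2 * f0); the factor of 2 accounts for the
    round trip of the reflected signal.
    """
    return doppler_shift_hz * C / (2 * F_CARRIER)

# A shift of ~5.13 kHz corresponds to roughly a 10 m/s closing speed.
print(radial_velocity(5133.0))
```

This direct velocity measurement is exactly what vision has to infer indirectly from frame-to-frame position changes, which is why dropping radar in fog/rain/snow raised eyebrows.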
 
True, but also true of any novelty, especially a technology one. Smartphones. Flat-screen TVs. HDTV. Heck, even the good old CD in its day. The true test of any technology is: do people keep using it once the novelty wears off?

Definitely. I still use AP every day, long after the novelty factor has worn off. And if it is good enough, I will definitely use FSD beta every day.
 
Here's what I noticed in the video:

1) Karpathy's laptop desktop is filled with Tesla dashcam screenshots, probably of interesting scenarios:



2) There's an impressive PMM (pedal misapplication mitigation) example where the car sees that there's a pond ahead and applies the brakes. This highlights Tesla's drivable-space NN:



3) He mentions there are 2000 "customers" with FSD beta, but that's a bit disingenuous since most of those are Tesla employees (albeit they did buy their cars).

4) Tesla is now certain that vision alone can do rangefinding for FSD. Previously, the industry had not solved this.

5) Karpathy mentions auto-labeling in the context of position, velocity, and acceleration (PVA). The object recognition NN helps put bounding boxes around cars, and then the radar / lidar data is automatically applied to each bounding box. Tesla is able to do auto-labeling in this context because they already have millions of manually labeled vehicles, so the purpose of auto-labeling here is simply to include PVA.

6) Tesla's entire AI team was focused on solving this vision problem for 4-5 months. It seems that FSD was limited by its dependence on radar, and Tesla needed to double down on vision in order to make a "step change." It's also possible that there was an impending radar shortage, and Tesla needed to solve this problem.

7) Karpathy says the training data for PVA included 1 million clips from all 8 cameras. It's unclear how Tesla is labeling PVA for the side and rear facing cameras, as they don't have the radar labels for those.

8) It's unclear if Karpathy's comments about radar are only related to Tesla's radar or about radar in general.

9) Tesla is now using the same car auto-labeling techniques for pedestrian PVA (see the trajectory prediction arrows):

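The auto-labeling idea in point 5 can be sketched roughly: the vision NN supplies 2D bounding boxes, and each radar/lidar detection (assumed here to already be projected into image coordinates) donates its position/velocity/acceleration to the box that contains it. This is a guess at the mechanics, not Karpathy's actual pipeline; all names and the matching rule are hypothetical.

```python
# Hedged sketch of radar-assisted auto-labeling: attach each radar
# detection's PVA (position, velocity, acceleration) to the camera
# bounding box that contains its projected image location.
from dataclasses import dataclass, field

@dataclass
class RadarDetection:
    u: float                  # projected image x-coordinate (assumed given)
    v: float                  # projected image y-coordinate
    range_m: float            # measured distance to target
    velocity_mps: float       # measured radial velocity
    accel_mps2: float         # derived acceleration

@dataclass
class BoundingBox:
    x_min: float
    y_min: float
    x_max: float
    y_max: float
    pva_labels: list = field(default_factory=list)

    def contains(self, u: float, v: float) -> bool:
        """True if the projected point falls inside this box."""
        return self.x_min <= u <= self.x_max and self.y_min <= v <= self.y_max

def auto_label(boxes, detections):
    """Attach each detection's PVA tuple to the first box containing it."""
    for det in detections:
        for box in boxes:
            if box.contains(det.u, det.v):
                box.pva_labels.append((det.range_m, det.velocity_mps, det.accel_mps2))
                break
    return boxes
```

The point of the scheme, as described in the talk summary above, is that the expensive part (drawing the boxes) was already done by hand at scale, so only the PVA annotation needs automating.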
 
Also Mobileye's current autonomous demo vehicle is vision only...

Let us know when they deploy their car / FSD to customers

Also, this is more relevant to vision-only deployment to customers:


But then there's this:

 
Also Mobileye's current autonomous demo vehicle is vision only...

Not only that, but Honda has been deploying camera-only L2 systems in several of their models using Mobileye's EyeQ4 since 2020. @powertoold of course is off.




 
6) Tesla's entire AI team was focused on solving this vision problem for 4-5 months. It seems that FSD was limited by its dependence on radar, and Tesla needed to double down on vision in order to make a "step change." It's also possible that there was an impending radar shortage, and Tesla needed to solve this problem.

Tesla has repeatedly gone down a given path, only to find it dead-ends.

In 2016 we were told radar would be the PRIMARY sensor, vision secondary, and why that was awesome and totally the way to go.

In 2019 (though I think this wasn't stated until Jan 2020, and then stated as 'nearly finished') we were told there'd be an almost total re-write based on fusing vision and radar using the BEV/4D framework and why THAT would be awesome and totally the way to go.

Now in 2021 we're told radar+vision is inferior to vision alone so they're dropping radar and why THAT will be awesome and totally the way to go.


On the one hand, it's frustrating when the story changes.

On the other hand, Tesla is willing to realize and admit their approach isn't working and try something else they think will... 3 different systems in 5 years now.


Meanwhile, Waymo has been over here since 2009 pounding away at the same EVERY SENSOR EVER fusion approach and hey, they've got a few L4 taxis that kinda sorta mostly work in one specific perfect-weather city with simple roads... (for context on missing dates, BTW, Google originally claimed they'd be selling self-driving cars by 2017)
 
Now in 2021 we're told radar+vision is inferior to vision alone so they're dropping radar and why THAT will be awesome and totally the way to go.

Do you think Tesla changed their story about radar? I think they were just touting the advantages of radar, but ultimately, radar's failure modes outweighed its benefits, so they're doubling down on vision. It still follows their vision-only narrative for the last few years.
 
Do you think Tesla changed their story about radar?

Yes- they literally did.


I'd like to see proof Tesla said radar would be the primary source of data for self-driving.


Ok (though that's not EXACTLY what I actually said), but here's what I was actually talking about:




"While there are dozens of small refinements with Version 8 of our software, described in addendum below, the most significant upgrade to Autopilot will be the use of more advanced signal processing to create a picture of the world using the onboard radar," reads a company announcement. "The radar was added to all Tesla vehicles in October 2014 as part of the Autopilot hardware suite, but was only meant to be a supplementary sensor to the primary camera and image processing system. After careful consideration, we now believe it can be used as a primary control sensor without requiring the camera to confirm visual image recognition."

(bold added)

The story is citing a post from Tesla in 2016 stating radar would be the primary sensor ahead of vision/imaging.

The post went on at length regarding the great advantages of using radar as primary like seeing the car ahead of the car ahead of you.

Tesla removed that post 6 weeks ago when they announced they were removing radar from cars, but there are tons of stories referencing it.


Here's another with even MORE proof-

it cites Elon himself here-

He also called using the radar as a primary sensor "a very hard problem" to solve, claiming that "no other" manufacturer could have done it without connecting all their cars to the cloud and using fleet learning.

He then goes on, hilariously, to explain how he kept being told it couldn't be done but he pushed to do it!

"It's something that I've wanted to do for a while. But I was always told that it's not possible. You can't do it, it's not going to work," said Musk in response to a question from The Verge. "I really pushed hard on question all those assumptions in the last three to four months. There's gotta be a way to make this thing work and now we believe there is."

And finally, the story notes the radar bounce advantage I mentioned-

Tesla's cars will also be able to "bounce" the radar signal off of the ground beneath the car in front of it, giving some indication of what's happening in front of that car, out of sight of both the driver and the camera system. Tesla says that this could even prevent an accident where the leading car crashes into an object in dense fog but the trailing Tesla does not.


That sufficient proof for you?
 
Not sure if Tesla ever said that radar would be the primary source of data, but in 2016, Elon did tout decoupling radar from cameras and using radar to create lidar-like point clouds. It obviously didn't pan out.

I think that tweet is saying that Tesla was trying to use radar alone to generate a point cloud (without the input from cameras). I don't think it has anything to do with FSD.
 
The story is citing a post from Tesla in 2016 stating radar would be the primary sensor ahead of vision/imaging.

The post went on at length regarding the great advantages of using radar as primary like seeing the car ahead of the car ahead of you.

Tesla removed that post 6 weeks ago when they announced they were removing radar from cars, but there are tons of stories referencing it.

Nah, you're misinterpreting "primary" sensor. That article is simply saying that Tesla intends to do sensor fusion with radar and cameras as the primary sensors. This was a result of the infamous AP under-truck death.

Tesla avoided doing this sensor fusion because of all the false-positives under bridges. That's why the article mentioned having a geocoded "whitelist" for bridges and whatnot to avoid the radar false-positives.
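The geocoded "whitelist" idea mentioned here can be made concrete with a small sketch: if a stationary radar return occurs near a known overhead structure, suppress it as a likely false positive. Everything below (coordinates, radius, function names) is a hypothetical illustration of the concept, not anything Tesla documented.

```python
# Hedged sketch of a geocoded whitelist for radar false positives near
# bridges/overhead signs. Coordinates and radius are made-up values.
import math

BRIDGE_WHITELIST = [(37.4275, -122.1697), (37.7749, -122.4194)]  # (lat, lon)
RADIUS_M = 50.0  # suppression radius around each whitelisted structure

def _haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000.0  # mean Earth radius, m
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def suppress_stationary_return(lat, lon):
    """True if a stationary radar return at this location should be ignored
    because it is within RADIUS_M of a known overhead structure."""
    return any(_haversine_m(lat, lon, b_lat, b_lon) <= RADIUS_M
               for b_lat, b_lon in BRIDGE_WHITELIST)
```

The obvious weakness, and presumably part of why the approach was dropped, is that a real stopped vehicle under a whitelisted bridge gets suppressed too.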
 
I'd like to see proof tesla said radar would be the primary source of data for self driving.
They have removed the blog entry but this article references it: Tesla Autopilot Upgrade Will Make Radar A Primary Control Sensor

After careful consideration, we now believe it can be used as a primary control sensor without requiring the camera to confirm visual image recognition. This is a non-trivial and counter-intuitive problem, because of how strange the world looks in radar.
 
I think that tweet is saying that Tesla was trying to use radar alone to generate a point cloud (without the input from cameras). I don't think it has anything to do with FSD.

But what would they use it for other than FSD? Of course it would be for FSD. The only reason to try to create lidar-like point clouds with radar, separate from cameras, would be that Elon was hoping radar could serve the same function as lidar for perception.
 
mysterious disappearing blog post...

I really don't remember that; it seems like it was being elevated to at least equal priority with vision, if not higher. I don't like trusting articles based on articles when Tesla is involved, but then they pulled the blog post...

People are misinterpreting "primary". Blue is a primary color. However, it is not the ONLY primary color :p

Radar was upgraded to a primary sensor, along with cameras.
 
Nah, you're misinterpreting "primary" sensor.

Nope.

It's pretty damn clear.

He went on at length about how important it was in many interviews at the time, pointing out the advantages of seeing cars ahead, the ability to see through bad weather, etc. ... and between that and his "pushing" the team as quoted, they decided it could be used as a primary control sensor WITHOUT imaging/vision back then.

Different story today of course.

That article is simply saying that Tesla intends to do sensor fusion with radar and cameras as the primary sensors. This was a result of the infamous AP under-truck death.

No, it literally says the opposite of that.

Tesla said:
we now believe it can be used as a primary control sensor without requiring the camera to confirm visual image recognition


Certainly they were still going to use the cameras, but, again as bolded in the previous cite, radar was originally ONLY secondary. From 2016 on, it was a primary control sensor. The car could and would act on radar input alone.




Tesla avoided doing this sensor fusion because of all the false-positives under bridges. That's why the article mentioned having a geocoded "whitelist" for bridges and whatnot to avoid the radar false-positives.


...so the article says they were going to do fusion... but then they didn't do fusion... except we know they DID do fusion and the recent change is them giving up on it?



None of which changes them saying in 2016 radar would be a primary control sensor without needing cameras confirming anything.
 
mysterious disappearing blog post...

I really don't remember that; it seems like it was being elevated to at least equal priority with vision, if not higher. I don't like trusting articles based on articles when Tesla is involved, but then they pulled the blog post...

I understand that. Here is the blog post from the Wayback Machine: Upgrading Autopilot: Seeing the World in Radar

Upgrading Autopilot: Seeing the World in Radar


The Tesla Team, September 11, 2016
While there are dozens of small refinements with Version 8 of our software, described in addendum below, the most significant upgrade to Autopilot will be the use of more advanced signal processing to create a picture of the world using the onboard radar. The radar was added to all Tesla vehicles in October 2014 as part of the Autopilot hardware suite, but was only meant to be a supplementary sensor to the primary camera and image processing system.

After careful consideration, we now believe it can be used as a primary control sensor without requiring the camera to confirm visual image recognition. This is a non-trivial and counter-intuitive problem, because of how strange the world looks in radar. Photons of that wavelength travel easily through fog, dust, rain and snow, but anything metallic looks like a mirror. The radar can see people, but they appear partially translucent. Something made of wood or painted plastic, though opaque to a person, is almost as transparent as glass to radar.