
Upgrading Autopilot: Seeing the World in Radar

"After careful consideration, we now believe it can be used as a primary control sensor without requiring the camera to confirm visual image recognition. This is a non-trivial and counter-intuitive problem, because of how strange the world looks in radar."

They don't sound too sure of themselves to me and they don't use the word "beta" once in the press release. I wonder how it made it through their legal department.
 
"After careful consideration, we now believe it can be used as a primary control sensor without requiring the camera to confirm visual image recognition. This is a non-trivial and counter-intuitive problem, because of how strange the world looks in radar."

They don't sound too sure of themselves to me and they don't use the word "beta" once in the press release. I wonder how it made it through their legal department.

Because they say that they won't enable that feature until enough fleet-learning data has been collected.

"Initially, the vehicle fleet will take no action except to note the position of road signs, bridges and other stationary objects, mapping the world according to radar. The car computer will then silently compare when it would have braked to the driver action and upload that to the Tesla database. If several cars drive safely past a given radar object, whether Autopilot is turned on or off, then that object is added to the geocoded whitelist."
 

And to that point about waiting for fleet data, they also say:

This may not always prevent a collision entirely, but the impact speed will be dramatically reduced to the point where there are unlikely to be serious injuries to the vehicle occupants.

To me, this is the most concerning part of the press release. They seem to be suggesting that the driver need not pay attention. In my opinion, it is essential for Tesla to also say this:

This may not always prevent a collision entirely, but the impact speed will be dramatically reduced to the point where there are unlikely to be serious injuries to the vehicle occupants, if for some reason the driver fails to follow the instructions and take control of the vehicle to avoid a potential collision, as required by Autopilot, which is a "driver assistance" system and not autonomous self-driving.

But for some reason Tesla seems to want the public to believe that their cars have "Autopilot" as the vast majority of the public understands that term, not as Tesla defines it after an accident while it is activated. Tesla should use their press release to further educate people on what Autopilot really is, rather than continue to play into the notion that it is autonomous driving.

When the data shows that false braking events would be rare, the car will begin mild braking using radar, even if the camera doesn't notice the object ahead.

So does this mean that hard braking events will be rare but mild braking will be common?
 
Interesting to think about what could be achieved with this approach and Autopilot 2.0 hardware (there were rumors there will be more radars).

For example, it would be easier to discriminate a discarded soda can: radically different shapes seen from different radars would tell the system it is just a soda can and not a genuinely dangerous object in front of the driver.

Anyhow, this seems like a great step for Tesla's unique approach to driverless cars.
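To make the multi-radar idea concrete, here is a toy sketch of how returns from several radars could be combined to reject small clutter. The cross-section numbers and thresholds are pure guesses on my part, not anything from the blog post.

```python
# Hypothetical sketch: using several radars to reject small clutter like a soda can.
# Thresholds and logic are illustrative assumptions only.
import statistics

SMALL_RCS = 0.05           # m^2, roughly soda-can sized return (assumed)
SHAPE_DISAGREEMENT = 2.0   # max/min ratio that counts as "radically different" (assumed)

def looks_like_clutter(rcs_per_radar):
    """rcs_per_radar: estimated radar cross-section of the same object as seen
    by each radar, in m^2."""
    strongest = max(rcs_per_radar)
    weakest = min(rcs_per_radar)
    # A small object looks radically different from different angles;
    # a car-sized obstacle gives a consistently large return.
    small = statistics.median(rcs_per_radar) < SMALL_RCS
    inconsistent = strongest / max(weakest, 1e-6) > SHAPE_DISAGREEMENT
    return small and inconsistent

print(looks_like_clutter([0.01, 0.08, 0.02]))  # likely a soda can -> True
print(looks_like_clutter([4.0, 3.5, 4.2]))     # car-sized obstacle -> False
```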
 
So does this mean that hard braking events will be rare but mild braking will be common?

I take that to mean when the car is pretty sure there's something real there (i.e. braking for something that isn't there is what I interpret as a "false braking event", so this statement is saying "when the odds of it being a false event are low"), it will begin braking just based off radar data even if the camera doesn't see anything that might otherwise trigger braking.

I could be wrong though.
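If that reading is right, the decision might boil down to a fleet-learned confidence threshold. Here is a toy sketch of the idea; the threshold, probabilities, and severity levels are made up for illustration, not Tesla's values.

```python
# Toy sketch of "brake on radar alone once false-braking events would be rare".
FALSE_BRAKE_TOLERANCE = 0.01   # accept at most ~1% chance the return isn't real (made up)

def radar_braking_decision(p_false_positive, camera_confirms):
    """Decide how hard to brake for a radar-detected obstacle ahead.

    p_false_positive: fleet-derived probability that this return is a harmless
                      stationary object (sign, bridge, etc.).
    camera_confirms:  whether the camera also sees an obstacle.
    """
    if camera_confirms:
        return "full braking"     # both sensors agree
    if p_false_positive <= FALSE_BRAKE_TOLERANCE:
        return "mild braking"     # radar alone, but a false alarm is very unlikely
    return "no braking"           # too likely to be a false positive

print(radar_braking_decision(0.001, camera_confirms=False))  # mild braking
print(radar_braking_decision(0.20, camera_confirms=False))   # no braking
```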
 
Would suck if there just happened to also be an object in the road at one of these "whitelisted" locations where radar braking is suppressed.

The object would have to be in the exact location of the geotagged object - and depending on how Tesla is handling it, they still might pick up that the object was more reflective than normal.

We know that they are already measuring the intensity of the return even in 7.1 - I'm nearly positive that's how it is deciding whether to show a car, truck, or motorcycle on the display (I've seen it make mistakes in both directions, always with things which have geometry that should affect the radar return strength.)

Assuming they include that data in the whitelist, they should pick up the extra reflector unless it is small enough to fall within equipment or environmental effect tolerances.

And, of course, the driver should be paying attention - you can bet that if the car is silently watching for false positives, it'll be just as aware of the driver suddenly slamming on the brakes or jerking the wheel when the car thought it was in the clear, and updating Tesla based on that.
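If the whitelist entries really do carry position plus expected return strength, the check could look roughly like this. This is a sketch under that assumption only; the field names, units, and tolerances are mine, not Tesla's.

```python
# Hypothetical whitelist check: position match AND return strength within tolerance.
from dataclasses import dataclass
import math

@dataclass
class WhitelistEntry:
    lat: float
    lon: float
    expected_return_db: float    # typical radar return strength at this spot

POSITION_TOLERANCE_M = 15.0      # GPS + radar localization slop (assumed)
RETURN_TOLERANCE_DB = 3.0        # equipment/environmental variation (assumed)

def distance_m(lat1, lon1, lat2, lon2):
    # Flat-earth approximation; fine for a ~15 m gate.
    dlat = (lat2 - lat1) * 111_320.0
    dlon = (lon2 - lon1) * 111_320.0 * math.cos(math.radians(lat1))
    return math.hypot(dlat, dlon)

def matches_whitelist(entry, lat, lon, return_db):
    """Ignore a return only if it is where we expect it AND no more reflective
    than the known stationary object (plus tolerance)."""
    close_enough = distance_m(entry.lat, entry.lon, lat, lon) <= POSITION_TOLERANCE_M
    not_brighter = return_db <= entry.expected_return_db + RETURN_TOLERANCE_DB
    return close_enough and not_brighter

bridge = WhitelistEntry(lat=37.4900, lon=-122.2100, expected_return_db=20.0)
print(matches_whitelist(bridge, 37.49001, -122.21001, 20.5))  # same spot, same strength -> True (ignore)
print(matches_whitelist(bridge, 37.49001, -122.21001, 32.0))  # much brighter (stopped truck?) -> False (don't ignore)
```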
 
The amount of information they are collecting from the cars, in addition to the knowledge gained by implementing this upgrade, is absolutely staggering. And they are building the database of objects to be ignored. All that information and software will be analyzed and used to drive the next version of the hardware platform.

Textbook example of wringing every possible advantage out of the already deployed hardware, and refining the design of the follow-on system.

Someone mentioned that a visual indicator to touch the wheel would work better if there were a heads-up display. That way, you are also ensuring the driver is looking ahead at what's going on. That's why fighter planes have HUDs, and why the Model 3 will also have one.

The media can pile on all they want with each corner case that causes an accident, but a machine-learning system is now in place that will only get better over time. Lives will be saved. That's the scorecard that matters, not whether the system is "perfect" or not, whatever that means.

RT
 
We know that they are already measuring the intensity of the return even in 7.1 - I'm nearly positive that's how it is deciding whether to show a car, truck, or motorcycle on the display.
I'm virtually certain they use the camera to tell a car from a truck, not the radar. I covered the camera on my parked Tesla and then placed another car in front of it. It did not get displayed on the IC until I uncovered the camera.
 

Hmm. It sees flatbed trucks and bobtail semi tractors as cars, and some cars/minivans with vertical rears as trucks. I'm surprised the camera would make that mistake with the semi cab, but I can't say for certain. For the other two, I can see the camera making the same mistakes I assumed the radar was making.
 
"The net effect of this, combined with the fact that radar sees through most visual obscuration, is that the car should almost always hit the brakes correctly even if a UFO were to land on the freeway in zero visibility conditions."

Hmm... so when Elon's Martian cousins land on Earth, at least Tesla cars won't hit them. :p

Unless, of course, the UFOs are made of wood...
 

The deep neural network in the Mobileye chip is what does the object recognition. It uses the front camera.

You can play around with object recognition if you have a mid->high end Nvidia graphics card.

GitHub - NVIDIA/DIGITS: Deep Learning GPU Training System

I used the latest version to create a Stig detector (the character from Top Gear).
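If you don't want to set up DIGITS, a pretrained classifier is an even quicker way to play with object recognition. The sketch below uses a pretrained torchvision model rather than DIGITS, and the image path is just a placeholder for your own photo.

```python
# Quick experiment with camera-style object recognition on a GPU (or CPU).
import torch
from torchvision import models
from PIL import Image

# Pretrained ImageNet classifier stands in for the "front camera" network here.
weights = models.ResNet50_Weights.DEFAULT
model = models.resnet50(weights=weights).eval()
preprocess = weights.transforms()          # the resize/normalize the model expects

img = Image.open("dashcam_frame.jpg").convert("RGB")   # placeholder image path
batch = preprocess(img).unsqueeze(0)

with torch.no_grad():
    probs = model(batch).softmax(dim=1)[0]

top5 = probs.topk(5)
for p, idx in zip(top5.values, top5.indices):
    print(f"{weights.meta['categories'][idx.item()]}: {p.item():.1%}")
```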
 
I was a little perplexed by the UFO mention on the blog. If some alien civilization had the technology to travel here, then surely they would have radar-masking technology as well.

But would their radar masking technology be compatible with the requirements of space travel and re-entry?

Honestly, I'd think that masking the atmospheric thermal bloom and wake turbulence would be the major problem, unless they have so much thrust and endurance that they can pretty much stop in the upper atmosphere and drop down at a controlled pace.
 

Sounds great and points out the complexity of what they are doing.

I'm definitely not going to second guess their large team of genius engineers by uttering silly criticisms since I have no depth of knowledge of this subject. I don't even know what I don't know.

I trust Elon and the Tesla engineers. They conceived of and designed a car that absolutely blows me away every time I drive it. 43,000 miles and I am still like a little kid. Based on my past experiences with Tesla's innovative technology, this will be great. Yes it will require responsibility on our part to follow the directions.

I have trouble following the logic in several of the comments above. For example, by explaining crash-mitigation capability, Tesla is not saying that the driver does not need to pay attention. That is a leap of logic akin to saying "since airbags reduce injury, drivers can now drive safely with their eyes closed."

The term autopilot has never implied fully autonomous driving. Even the original use of the word in 1958 was only describing basic cruise control. Tesla repeatedly points out that they are a ways away from fully autonomous driving.