Welcome to Tesla Motors Club

What of the camera now that reliance is on radar?

I'm wondering if I maybe misunderstood today's announcement, but what I took away from it is that, on an ongoing basis, Autopilot will be much less reliant on the camera and much more reliant on radar.

This doesn't seem to jibe with the recent discovery of the dual camera housing on the Model Xs. Why would they start rolling out new parts to accommodate more cameras only to then push the heavy lifting over to radar?

Did I misunderstand the release?
 
They still need the camera to detect road signs, traffic lights, and other objects that require a visual processor.

The current announcement didn't cover any of that support, which would essentially make it Level 3 autonomous. The speculation (hope) is that all new cars will have the necessary provision in the hardware to achieve all of it, or perhaps they decided to ditch it altogether, since no Model S owners have reported the dual housing yet.
 
I'm hoping they still use the camera for more than just lanes and signs. For example, the camera can map the drivable surface ahead, with and without lane markings, which could let AP know whether it would be safe to go onto the shoulder to avoid a collision.
 
I'm wondering if I maybe misunderstood today's announcement, but what I took away from it is that, on an ongoing basis, Autopilot will be much less reliant on the camera and much more reliant on radar.

This doesn't seem to jibe with the recent discovery of the dual camera housing on the Model Xs. Why would they start rolling out new parts to accommodate more cameras only to then push the heavy lifting over to radar?

Did I misunderstand the release?

No, you misunderstood the dual camera part. Every X, from the initial Founders Series deliveries in September 2015 onward, has had a dual camera housing with only one camera in it. There has been no recent change to the Model X hardware that we know of; they have all had that equipment.

Our assumption had been that this was in preparation for a future retrofit, but it's possible that, with the Mobileye breakup and the new radar-first strategy, the plan those parts were made for is no longer relevant.
 
My business is in automation sensors, so let me try to shed some light on this while keeping the technical detail to a minimum.

Previously the radar functioned as a supplemental sensor to the camera, meaning the camera would make a decision based on the data it received from its CCD, advised by the radar and ultrasonics. Now the radar carries as much weight as the camera does, whereas previously its input may have been weighted at around 10%.

Previously the camera's decision was the final decision, so even if the radar was sure a crash was coming, the camera could overrule it. Now it can't: both systems must agree that the situation is safe. This adds in the issue that radar gives a lot of false positives (think of it as the hypochondriac of sensors), and that is where the geotagging comes in. It helps the radar produce fewer false positives.
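To make the change concrete, here is a toy sketch of the two fusion policies described above. Everything in it (function names, risk scores, weights, thresholds) is invented for illustration; it is not Tesla's implementation.

```python
# Toy illustration of the described fusion change. All names, weights, and
# thresholds are made up for the sake of the example.

def should_brake_old(camera_risk: float, radar_risk: float) -> bool:
    """Pre-update policy: the camera dominates, radar input is weighted
    lightly (~10%), so the camera can effectively overrule the radar."""
    combined = 0.9 * camera_risk + 0.1 * radar_risk
    return combined > 0.5

def should_brake_new(camera_risk: float, radar_risk: float,
                     target_is_whitelisted: bool) -> bool:
    """Post-update policy: either sensor alone can declare the situation
    unsafe. A radar return matching a geotagged whitelist object (e.g. an
    overhead sign known not to be a real obstacle) is discounted, which is
    how the geotag list suppresses radar false positives."""
    if target_is_whitelisted:
        radar_risk = 0.0  # whitelist: known stationary non-obstacle
    return camera_risk > 0.5 or radar_risk > 0.5
```

Under the old policy, a confident radar warning (0.9) paired with an unconcerned camera (0.2) produces no braking; under the new policy it does, unless the radar target is on the whitelist.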
 
I suspect Tesla's switch to better radar processing had something to do with the Mobileye termination. Mobileye sells Tesla the camera and related system, so development of that system may be frozen at this point. Either Tesla already had the radar processing under development, which prompted the split with Mobileye, or Mobileye split with Tesla, leaving Tesla no option but to improve the other components.

I also read a rumor that Tesla will be replacing Mobileye's technology with a similar one from Bosch.
 
A lot of the radar emphasis came to light following the Joshua Brown accident, which highlighted the limitations of relying solely or mostly on cameras. I think (pure speculation on my part) the greater emphasis on radar was already the plan and the accident simply accelerated public announcements of it. If Tesla isn't going to use Lidar (which Elon doesn't think is needed), they knew all along that they were going to have to make better use of radar.

I'm also skeptical about Mobileye's stated reasons for parting ways with Tesla: "simply providing technology and not being in control of how it is being used". No sane business walks away from a high-profile customer for that, especially when that is the norm for component providers. I think Mobileye walked away because they didn't want to appear to be supportive of moving away from a vision-centric approach and risking their other customers also going down that path. Plus, Tesla has been hiring lots of chip designers lately. They are very likely planning on making their own chips and replacing some of the externally-sourced IP with internal designs in the longer run. Knowing they would lose Tesla as a customer in the long run makes Mobileye's decision make even more sense.
 
As Phoenixhawk said: the camera will be the primary mechanism for identifying the lane markings. Remember, identifying the lane markings accurately is the key to steering, so the camera will continue to play its vital role in Autosteer (not TACC). In addition, the radars will now give key input in making sure collisions are avoided or mitigated.
 
As Phoenixhawk said: the camera will be the primary mechanism for identifying the lane markings. Remember, identifying the lane markings accurately is the key to steering, so the camera will continue to play its vital role in Autosteer (not TACC). In addition, the radars will now give key input in making sure collisions are avoided or mitigated.

This is where it gets interesting. If there are enough objects on the whitelist, and the radar profiles and geotags are precise enough, in principle the car could navigate and stay in the lane without the camera, based entirely on the measured locations of each whitelisted object and the radar returns it gets from them.

This certainly wasn't Tesla's initial intent in developing the list, and undoubtedly the camera will remain the principal tool for identifying the path for now, but going forward I could easily see Tesla using the RNav from the whitelist as a double check, and possibly to fill in during inclement weather or when glare blinds the camera.
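The "navigate from whitelisted objects" idea above is essentially trilateration: if the geotags give you known landmark positions and the radar gives you ranges to them, you can solve for the car's position. A minimal sketch, with made-up coordinates and a crude gradient-descent solver (nothing here reflects Tesla's actual data or method):

```python
import math

# Crude trilateration sketch: estimate position from known landmark
# coordinates (the geotag whitelist) and measured radar ranges.
# All numbers are illustrative.

def localize(landmarks, ranges, x0=0.0, y0=0.0, iters=200, step=0.1):
    """Gradient descent on the squared range-error cost.
    landmarks: list of (x, y) positions of whitelisted radar objects.
    ranges: measured radar range to each landmark, in the same units."""
    x, y = x0, y0
    for _ in range(iters):
        gx = gy = 0.0
        for (lx, ly), r in zip(landmarks, ranges):
            d = math.hypot(x - lx, y - ly) or 1e-9  # avoid divide-by-zero
            err = d - r                             # range residual
            gx += err * (x - lx) / d
            gy += err * (y - ly) / d
        x -= step * gx
        y -= step * gy
    return x, y

# Three landmarks and the ranges a car at (3, 4) would measure:
landmarks = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
ranges = [5.0, 65 ** 0.5, 45 ** 0.5]
x, y = localize(landmarks, ranges, x0=1.0, y0=1.0)
```

With three or more well-spread landmarks the estimate is unique, which is why the density and precision of the whitelist would matter so much for any camera-free fallback.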
 
Anytime you deal with machine automation, the holy grail is to have multiple redundant sensor sources, and to have them be different types of sensor. If you can do something with a vision system or with a laser scanner, doing it with both is better. This just raises the issue of what to do if the two disagree, so bringing in a third only makes sense. If the car could drive completely off the geotag data, but also have a camera to drive from, AND use radar to verify what the geotag and vision-system data see... well, that is just about perfect from a machine automation standpoint.
 
Anytime you deal with machine automation, the holy grail is to have multiple redundant sensor sources, and to have them be different types of sensor. If you can do something with a vision system or with a laser scanner, doing it with both is better. This just raises the issue of what to do if the two disagree, so bringing in a third only makes sense. If the car could drive completely off the geotag data, but also have a camera to drive from, AND use radar to verify what the geotag and vision-system data see... well, that is just about perfect from a machine automation standpoint.

Yup. And that's why I said this list is so interesting, beyond the crash-safety implications. If Tesla chooses, they can build the dataset to have all three simultaneously with the current cars. No one else has that kind of dataset - or even the resources to build such a dataset in a reasonable timeframe.
 