Welcome to Tesla Motors Club

Sonar for side "view" in Autopilot 1.0 instead of cameras - cost choice?

Some people have claimed that detection of tractor trailers in the adjacent lane lacks precision because of the trailers' high clearance. I would think that - if this is in fact an issue - it could be worked around in software even with the current hardware. E.g., "If the camera detects a cab in the next lane at the 2 o'clock position, look for a wonky sonar signature directly next to the car - that will be where the trailer is" - or the system could watch for vehicles approaching from behind via the rear-view camera.
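Just to make the idea concrete, here's a rough sketch of the kind of logic I mean. This is purely hypothetical - the detection fields, bearings, and thresholds are all made up for illustration, not anything from Tesla's or Mobileye's actual software:

```python
# Hypothetical camera + sonar fusion heuristic for spotting a
# high-clearance trailer beside the car. All field names, bearings,
# and thresholds below are invented for illustration only.

def trailer_alongside(camera_detections, sonar_readings_m, max_gap_m=2.5):
    """Return True if a truck cab seen ahead at ~2 o'clock, combined
    with an intermittent short-range sonar signature beside the car,
    suggests a trailer is occupying the adjacent lane."""
    # Step 1: did the forward camera see a truck cab at ~2 o'clock?
    cab_ahead = any(
        d["type"] == "truck_cab" and d["bearing"] == "2_oclock"
        for d in camera_detections
    )
    if not cab_ahead:
        return False
    # Step 2: look for a "wonky" sonar signature directly beside us -
    # e.g. some returns close (wheels/axles) mixed with far returns
    # (open air under the raised trailer bed).
    close_returns = [r for r in sonar_readings_m if r < max_gap_m]
    intermittent = 0 < len(close_returns) < len(sonar_readings_m)
    return intermittent


cam = [{"type": "truck_cab", "bearing": "2_oclock"}]
sonar = [1.1, 8.0, 1.0, 7.5]   # alternating wheel / underbody returns
print(trailer_alongside(cam, sonar))  # prints: True
```

The point being: even if the side sonar alone can't reliably "see" a raised trailer, the forward camera could prime the system to interpret an odd sonar pattern as one.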

But I digress - why use sonar in the first place for the side view rather than initially choosing to use multiple cameras? Cost issue? Programming complexity which would have pushed out the delivery date of Autopilot 1.0 to an unacceptable timeframe?

From what we know, the current hardware "brain" - the Mobileye EyeQ3 - is capable of processing input from multiple cameras, but Tesla chose not to set it up this way in 1.0.

OR - is there in fact an operational advantage to sonar?

Cost savings seems a bit odd from my admittedly ignorant POV - digital camera sensors are already inexpensive, and the EyeQ3 is already capable of handling multiple camera inputs. So why, in Oct 2014, did they not just put a 360-degree camera view in the cars - perhaps in addition to the sonar? Even if it would have taken additional time to program the system, they could have turned on the side cameras in a future software update.

It's such an obvious question that they must have discussed it extensively when choosing the initial hardware suite for 1.0.

Marketing a la Apple? Introduce new features step-wise to create a reason to upgrade to a new car in a few years?

Because if you think about it - a modular system (i.e. the ability to swap in a better CPU later) with an over-built data bus and 360-degree cameras with swappable sensors and lenses might have made a 2014 Tesla the last car you ever purchase.

Mobileye's CEO claimed in early 2015 that the existing EyeQ3 used by Tesla since 2014 is only using 5% of its processing capability right now, and that it was built with the ability to process input from multiple cameras.

The battery is already designed to be swappable for when chemistry gets denser.

So if Tesla had baked in all the sensors and cameras you'll ever need for true self-driving in version 1.0, you might not need another car purchase for 15 years or more.
 
For me, the sonar as installed on the Model S front and rear bumpers is inherently limited. If you look at the sonar coverage pattern, there is a blind spot at close distance near the B pillar. A wide-angle camera at each side mirror would be much better imo. I think a camera would also be more effective for auto-parking, since it does not depend on obstruction detection. I'd like to see a 360 view.
 
Clearly they could have used multiple cameras in version 1.0. Mobileye's website says that even the *previous*-generation EyeQ2 could handle two camera sensors simultaneously, and the EyeQ3 is said to be six times more powerful than that:

"The second generation Mobileye EyeQ2® is more powerful by a factor of 6 and support all the above algorithms and more on a single platform, and supports video input from two high-resolution image sensors, as well as video out capabilities with graphic overlay. In serial production since 2010. The third generation Mobileye EyeQ3® is more powerful by factor of 6 than Mobileye EyeQ2® and will allow processing of multiple high resolution sensors in parallel, resulting in range extension and enhanced features for the end customer."

Processing Platforms - Mobileye
 
Why would Tesla not want to encourage you to purchase a new car? After a while, a significant percentage of their sales will need to be to existing owners - in fact, that is probably starting to be the case now.
 
I don't know if replacing one's car every two years due to technology advancement is good practice, even if one can afford it. Tesla could champion the idea that your car can last at least 4 years while staying reasonably up to date by offering incremental upgrades. That way, the resale value would not drop too drastically, and they would gain more loyalty from customers who are concerned about fast depreciation. Elon's environmental goal is to convert more ICE drivers to EVs.
 
The front camera's view is illuminated at night by the headlights. If the side sensors were cameras, they'd be blind on a dark stormy night. They would also be subject to rain splatter obscuring the view, since they'd be mounted low and not cleared by a wiper. Ever look at the rear camera after a hard rain? It's unusable unless you pull over, get out, and wipe the lens off. Blind spot collision warning, lane changing, etc. would all be severely hampered at night and in rain if the sensors were cameras.
 
My interest in side cameras is for auto stall parking. For blind spot detection and lane changing, radar would be more suitable than sonar.