
Is the forward-facing radar sufficient for a self-driving car?

Volvo and Audi/VW I believe are headed toward 360 camera coverage, 360 radar coverage, 360 ultrasonics, as well as LIDAR in the front.


FWIW there is a common fallacy in sensor fusion designs that "more sensors = better decisions", which is not really supported by history or the engineering complexity involved.

If you really get into a situation where radar says "OMG THERE IS A GIANT CAR RIGHT NEXT TO YOU" and camera says it sees nothing, and LIDAR says it's seeing some debris way off in the distance on the ground.... what does a sensor fusion system do?

Unfortunately, too many mixed signals is not a formula for better decision making. For these sensors to combine into a better decision, there must be heuristics to prefer one subsystem over another for a given condition... and those heuristics can be prone to error too.
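To make that concrete, here is a minimal toy sketch of the arbitration problem. All the names, weights and thresholds below are hypothetical and not anyone's actual implementation; the point is only that fusing disagreeing sensors forces exactly this kind of hand-tuned tie-breaking.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    sensor: str                   # "radar", "camera" or "lidar"
    distance_m: Optional[float]   # reported range, None if the sensor sees nothing
    confidence: float             # the sensor's own confidence, 0..1

def fuse(detections, weather="clear"):
    """Pick one obstacle estimate out of disagreeing sensors.

    The per-condition weights below are the 'heuristics to prefer one subsystem
    over another' -- and every one of them is itself a potential source of error.
    """
    weights = {"radar": 0.5, "camera": 0.3, "lidar": 0.2}
    if weather == "rain":
        # e.g. trust radar more and the camera less when visibility is poor
        weights = {"radar": 0.7, "camera": 0.1, "lidar": 0.2}

    scored = [(weights[d.sensor] * d.confidence, d)
              for d in detections if d.distance_m is not None]
    if not scored:
        return None                      # nobody sees anything
    score, best = max(scored, key=lambda s: s[0])
    return best if score > 0.25 else None  # below threshold: ignore it (maybe wrongly)

# The mixed-signal case from the post: radar screams "giant car right next to you",
# the camera sees nothing, lidar sees distant debris on the ground.
readings = [
    Detection("radar", 2.0, 0.9),
    Detection("camera", None, 0.0),
    Detection("lidar", 60.0, 0.4),
]
print(fuse(readings))   # radar "wins" here, but only because of the hand-tuned weights
```

The answer that wins is whatever the weights say it is, and those weights are just another thing that can be wrong.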

Audi, Volvo, and VW already put more sensors on their cars (as does Mercedes) compared to Tesla for their L2 ADAS systems, yet none of them perform nearly as well as Tesla's.


I'm not sure if Tesla has a viable strategy for FSD with 8 cameras and 1 radar, but if you rewind time a year, I also said that there's no way Tesla can make a lane keeping system with 1 camera and 1 radar when Mercedes had a 5 radar and stereo camera solution that performed terribly.
 
FWIW there is a common fallacy in sensor fusion designs that "more sensors = better decisions", which is not really supported by history or the engineering complexity involved.

Fair enough. I can see your point. However, that does not mean what Tesla has is the optimal setup. It seems unlikely IMO.

Audi, Volvo, and VW already put more sensors on their cars (as does Mercedes) compared to Tesla for their L2 ADAS systems, yet none of them perform nearly as well as Tesla's.

That is true for lane keeping. I would say everything else they have is more robust compared to AP1, like blind spot detection and forward collision detection. For example, my former Audi A8 had two radars and a camera at the front simply for adaptive cruise and forward collision warning. And its second forward camera, a night-vision unit kept clean by a dedicated water spray, could audibly warn of pedestrians even in near darkness. And this was about 5 years before AP1. Surely overengineered hardware-wise, but it was superb at seeing even stopped cars and stopping in time. Same with its two rear radars, which spotted cars in the blind spot from a long way off. Limited functionality, due to the very, very conservative software strategy, but very robust.

Lane keeping on the Germans has so far been unsatisfactory; that is absolutely true.

I'm not sure if Tesla has a viable strategy for FSD with 8 cameras and 1 radar, but if you rewind time a year, I also said that there's no way Tesla can make a lane keeping system with 1 camera and 1 radar when Mercedes had a 5 radar and stereo camera solution that performed terribly.

I agree Tesla certainly made an excellent lane keeping system with the 1 camera and 1 radar - and I would say that is a testament to their software prowess and continuous software update strategy. And their cloud service - let's not forget that, it is important. But in that case the one camera certainly is sufficient for motorway lane keeping in most weather. FSD is much harder.

That software prowess and strategy is still why I fully expect AP2 to be the first on the market to do FSD in many scenarios, though. You simply cannot beat constantly improving software and a crazy bold approach - unless the latter fails on its own, of course, which I do not expect it will. Already with AP1, Tesla was clearly pushing MobilEye's technology far beyond MobilEye's conservative comfort zone and was getting results beyond what others could and would.
 
First time poster, so take it easy on me.

From reading this thread, it appears that what is not being said is that cameras have the ability to "see" a lot better than humans, especially in low-light situations. I have no idea if lidar or radar would work better than cameras or are needed in addition to cameras, but cameras should be able to "see" a lot better than your typical human. So, if the issue is really whether a camera "sees" better than a human, I'd think AP2 will be well beyond human abilities now, or certainly in the future as cameras improve. Is that good enough? Again, I have no idea, but it should be better than even a good driver today.
 
First time poster, so take it easy on me.

From reading this thread, it appears that what is not being said is that cameras have the ability to "see" a lot better than humans, especially in low-light situations. I have no idea if lidar or radar would work better than cameras or are needed in addition to cameras, but cameras should be able to "see" a lot better than your typical human. So, if the issue is really whether a camera "sees" better than a human, I'd think AP2 will be well beyond human abilities now, or certainly in the future as cameras improve. Is that good enough? Again, I have no idea, but it should be better than even a good driver today.

I think most of us believe the cameras can see better than humans and can cover the car's entire surroundings at once, unlike humans, so that part holds great promise.

But humans have more eyes and hands, so to speak, for those tough moments. All it takes to knock the camera system out is one side camera partially covered by mud - one small spot on the surface of the car. There is no redundancy for that.
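To illustrate the redundancy point, here is a minimal sketch. The camera names and the coverage map are made up for illustration, not Tesla's actual layout: if a viewing sector is imaged by only one camera and that camera goes blind, there is nothing left to fall back on except the human.

```python
# Hypothetical coverage map: which viewing sectors each camera covers.
COVERAGE = {
    "front_main":     {"front"},
    "front_narrow":   {"front"},       # overlaps front_main, so "front" has a backup
    "left_repeater":  {"left_rear"},   # sole camera for its sector
    "right_repeater": {"right_rear"},  # sole camera for its sector
}

def all_sectors_visible(camera_ok: dict) -> bool:
    """True only if every sector is still imaged by at least one healthy camera."""
    sectors_needed = set().union(*COVERAGE.values())
    healthy = [cov for cam, cov in COVERAGE.items() if camera_ok.get(cam)]
    sectors_seen = set().union(*healthy) if healthy else set()
    return sectors_needed <= sectors_seen

# One repeater partially covered by mud: its whole sector goes dark and the
# system has to hand control back, because no other camera images that sector.
print(all_sectors_visible({"front_main": True, "front_narrow": True,
                           "left_repeater": False, "right_repeater": True}))  # False
print(all_sectors_visible({"front_main": False, "front_narrow": True,
                           "left_repeater": True, "right_repeater": True}))   # True
```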
 
First time poster, so take it easy on me.

From reading this thread, it appears that what is not being said is that cameras have the ability to "see" a lot better than humans, especially in low light situations. I have no idea if idar or radar would work better than cameras or are needed in addition to cameras, but cameras should be able to "see" a lot better than your typical human. So, if the issue is really about does a camera "see" better than a human, I'd think the AP2 will be well beyond the abilities of humans now, or certainly in the future with cameras. Is that good enough, again, I have no idea, but it should be better than even a good driver now.
I had a little fun with this thread but essentially made the same basic comments. Some posters here seem to think their expertise is greater than Musk's and assert that he is wrong -- or lying -- when he says the current hardware will be sufficient for self-driving. You decide who you want to believe. Within a year or two, we should have a better idea who is right. Ironically, our states allow people with terrible vision to drive, yet we debate how perfect self-driving must be to get regulatory approval. I'll never forget the time I observed a driver's license examiner in Tucson pass a person who had failed the eye exam machine several times by asking her to look across the street and read a large billboard!
 
I had a little fun with this thread but essentially made the same basic comments. Some posters here seem to think their expertise is greater than Musk's and assert that he is wrong -- or lying -- when he says the current hardware will be sufficient for self-driving. You decide who you want to believe. Within a year or two, we should have a better idea who is right. Ironically, our states allow people with terrible vision to drive, yet we debate how perfect self-driving must be to get regulatory approval. I'll never forget the time I observed a driver's license examiner in Tucson pass a person who had failed the eye exam machine several times by asking her to look across the street and read a large billboard!

I have yet to see anyone on this thread who doesn't believe AP2 can self-drive. We have all seen the video, right? With a passive driver sitting in the car for regulatory/safety reasons and sufficiently good weather, I would say people are confident it will. I know I am. It is the best product to buy at the moment for this purpose, obviously.

What I have seen questioned is whether this level of sensors is sufficient to handle self-driving with no humans on board, bad weather, and wider global regulation (at least the whole USA and EU, for example). And will 8 cameras + 1 radar + ultrasonics eventually be a sufficient sensor suite, or will the industry (Tesla included) gravitate towards more sensors before crossing that final self-driving threshold...

But yes, in 1-2 years AP2 should be driving itself. Whether it will do so in inclement weather, without any human interaction or even any humans on board, is another question.

(Then again, AP1 never reached all those promises Elon made about traffic lights, meet-you-at-the-curb Summon and entry-to-exit driving. Maybe some healthy skepticism about Elon's promises would be warranted.)
 
That is true for lane keeping. I would say everything else they have is more robust compared to AP1, like blind spot detection and forward collision detection. For example, my former Audi A8 had two radars and a camera at the front simply for adaptive cruise and forward collision warning. And its second forward camera, a night-vision unit kept clean by a dedicated water spray, could audibly warn of pedestrians even in near darkness. And this was about 5 years before AP1. Surely overengineered hardware-wise, but it was superb at seeing even stopped cars and stopping in time. Same with its two rear radars, which spotted cars in the blind spot from a long way off. Limited functionality, due to the very, very conservative software strategy, but very robust.

FWIW I'm an Audi guy too -- my last car was a 2014 A6 with night vision and ACC, and it used the same subsystems derived from your A8.

While I agree it had a better strategy for cleaning the night vision camera, the rest left a lot to be desired:

- The night vision camera warns about pedestrians but does not work together with the lane recognition camera. I would get frequent false alarms for pedestrian detection when the road bends, because the car assumes I will drive straight off the road and ram pedestrians well before the lane even starts to curve.
- Even with 2 radar sensors, cut-in and cut-out detection was very poor. If a car suddenly moved into your lane and cut you off, it took about a second before the car compensated, and when it did, it was usually via an abrupt braking action.
- The behavior when approaching nearly-stopped traffic at 80 mph was awful. The car would not initiate braking until ~100-150 ft away, and by then stopping is beyond the ACC's allowed braking force, so you just get a forward collision warning and have to slam on the brakes yourself (rough numbers in the sketch after this list). Scares the crap out of your passengers.
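For a sense of scale on that last point, here is some rough arithmetic - a sketch assuming the traffic ahead is fully stopped (the post says "nearly-stopped", which only makes it slightly less bad) and using the figures quoted above:

```python
# Rough numbers for the 80 mph / 100-150 ft scenario (approximate, just for scale).
v = 80 * 0.44704               # 80 mph in m/s, about 35.8 m/s
for gap_ft in (150, 100):
    d = gap_ft * 0.3048        # remaining gap in metres
    a = v ** 2 / (2 * d)       # constant deceleration needed to stop within the gap
    print(f"{gap_ft} ft: {a:.1f} m/s^2, about {a / 9.81:.1f} g")
# Prints roughly 1.4 g at 150 ft and 2.1 g at 100 ft -- far beyond the gentle
# braking an ACC system is allowed to command, and in fact beyond what ordinary
# road tires can deliver at all, so a collision warning is all the car has left.
```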

I do agree that Audi did a lot of things right, but in terms of their ACC implementation, regardless of whether you measure by lane keeping or throttle/brake management, there was little evidence that 2 radar sensors are better than the 1 that AP1 has.


I do agree, though, that the blind spot radar system works far far better than Tesla's blind spot ultrasonic detection system. That's one of the features I really miss from my A6.
 
@chillaban

Sounds about right. And I agree the integration of the systems was limited - as you say, the night vision was separate from the rest. The radars had some involvement with the camera, navigation and crash prediction, but in the end it was more like a bunch of separate systems - and they never improved during the ownership of the car.

As for the ACC, I did consider it very good, but at the time I had come through two or three generations of previous Audi ACCs and the A8's was a revelation by comparison, so that was my frame of reference. I agree it was not perfect, of course. Mind you, this was in a 2010 model or something like that, so quite old now. It will be interesting to see what the new A8 launching this year is like; it is supposed to have Level 3 at least under a certain speed.

Anyway, the very conservative software and integration policy, combined with a rather aggressive hardware policy, means the traditional premium driver aids have had tons of sensors but have used them in a very limited scope.

Tesla is far more aggressive in software, and far more conservative in sensor hardware, and is getting more results with less.
 