Welcome to Tesla Motors Club

Camera Replacing Radar [when will tesla vision be activated on cars with radar?]

My possible scenario is that Tesla has the system log both a vision only and a vision + radar view, then compare how the software handles each. According to Tesla, the problem with radar is that it can generate a lot of false positives in many situations (causing things like phantom braking) but it would be interesting to see if radar does help in specific scenarios.
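The logging-and-compare idea in that scenario is essentially what's often called "shadow mode": run both estimates in parallel and flag the frames where they disagree. A minimal sketch of the concept, with invented log fields and thresholds (none of these names come from Tesla):

```python
# Hypothetical "shadow mode" sketch: log a vision-only range estimate and a
# vision+radar (fused) estimate side by side, then flag frames where they
# disagree. Field names and the threshold are illustrative, not Tesla's.

def compare_pipelines(frames, disagreement_threshold_m=2.0):
    """Return the frames where the two range estimates diverge."""
    flagged = []
    for frame in frames:
        vision_range = frame["vision_range_m"]  # camera-derived distance
        fused_range = frame["fused_range_m"]    # camera + radar distance
        if abs(vision_range - fused_range) > disagreement_threshold_m:
            flagged.append(frame)
    return flagged

log = [
    {"t": 0.0, "vision_range_m": 41.8, "fused_range_m": 42.1},  # agree
    {"t": 0.1, "vision_range_m": 44.0, "fused_range_m": 12.3},  # radar false positive?
]
print(compare_pipelines(log))  # only the disagreeing frame is returned
```

The interesting frames are exactly the ones where the estimates diverge, since those are where one pipeline (say, a radar false positive that would have triggered phantom braking) is wrong.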
 
My Model 3 is a 2018 AWD Dual Motor with Enhanced Auto Pilot and system 2.5. Will my car continue with its radar + cameras, or will there be a recall or something to disable the radar and enable all the cameras? Thanks.

Absolutely no one knows, and anyone saying they know either works for Tesla or is simply speculating.
 
It will simply be a software update.

I'm almost positive it will. They will not want to maintain two NN code bases (one for cameras and one for radar).

@RTPEV:

Where are you getting your information from? To my knowledge, there's been no official or definitive word on whether or not Tesla will *ever* disable radar units or stop using the data from them on any vehicle already equipped with radar. In fact, two different Tesla customer service chat agents (admittedly, not a great source, but something) have told me they wouldn't disable radar on vehicles that already have them.

Admittedly, since Tesla literally disbanded their public relations office, it's hard for reporters or anyone to actually get information out of them, but still, there's no reporting I'm aware of that discusses this.

Can you provide sources? Links?

R,
Bill
 
@RTPEV:

Where are you getting your information from? To my knowledge, there's been no official or definitive word on whether or not Tesla will *ever* disable radar units or stop using the data from them on any vehicle already equipped with radar. In fact, two different Tesla customer service chat agents (admittedly, not a great source, but something) have told me they wouldn't disable radar on vehicles that already have them.

Admittedly, since Tesla literally disbanded their public relations office, it's hard for reporters or anyone to actually get information out of them, but still, there's no reporting I'm aware of that discusses this.

Can you provide sources? Links?

R,
Bill
I don't claim to have authoritative information, but this is just common sense. There would be no need to physically bring a car with radar in to disable or remove the radar transceivers--this can effectively be done through software, so if they are in fact going to move to a vision-only solution on those cars, I'm almost certain they would simply do that via a software update.

And that takes us to the second comment that you quoted: whether or not they will actually disable radar on radar-equipped vehicles. I'm slightly less certain on this point, but again, it's common sense. If Tesla feels that they are able to achieve the same level of function/safety with vision-only systems as radar-equipped systems, why would they continue to maintain a separate neural net that is processing radar inputs in addition to vision inputs? This means that every time they want to retrain the net, they now have to do it twice. We are talking massive amounts of computing required to train the net. It would be really foolish of Tesla to continue to maintain a radar-equipped NN when the vision-only NN is capable of delivering the same or better results.

As for the reports from the Tesla CSRs, I think you said it best yourself...not a great source. In fact, they usually seem to say the exact opposite of what the actual truth is, so I'll use your report as my source! ;)
 
I'm almost positive it will. They will not want to maintain two NN code bases (one for cameras and one for radar).



They're already stuck doing that.

The OP has HW2.5.

It can't run the newer, larger NNs.

I suspect the 2.x codebase is done getting significant updates and they're not going to ever replace it with a vision-only setup. No radar-less cars with 2.x were ever made anyway.


That said- the 3.x HW and higher cars likely WILL all go to vision only (and Tesla has expressly said vision only is the future plan, saying so as early as 2018) eventually.

But today vision only remains less capable, so it's not time yet.

When you see them remove the restrictions on vision cars compared to radar cars, then it might be getting close to time.
 
Got it. I guess I missed the point that we weren't talking about FSD and only AP/EAP. I agree with what you're saying for HW 2.5 systems. Those systems obviously will have a separate "codebase" that will likely not see the same level of development as the HW3 stack which will be driving towards autonomous driving capability.
 
I don't claim to have authoritative information, but this is just common sense. There would be no need to physically bring a car with radar in to disable or remove the radar transceivers--this can effectively be done through software, so if they are in fact going to move to a vision-only solution on those cars, I'm almost certain they would simply do that via a software update.

And that takes us to the second comment that you quoted: whether or not they will actually disable radar on radar-equipped vehicles. I'm slightly less certain on this point, but again, it's common sense. If Tesla feels that they are able to achieve the same level of function/safety with vision-only systems as radar-equipped systems, why would they continue to maintain a separate neural net that is processing radar inputs in addition to vision inputs? This means that every time they want to retrain the net, they now have to do it twice. We are talking massive amounts of computing required to train the net. It would be really foolish of Tesla to continue to maintain a radar-equipped NN when the vision-only NN is capable of delivering the same or better results.

As for the reports from the Tesla CSRs, I think you said it best yourself...not a great source. In fact, they usually seem to say the exact opposite of what the actual truth is, so I'll use your report as my source! ;)

To be clear, you've said nothing so far that establishes that Tesla actually plans to stop using radar inputs on vehicles already equipped with radar, even as "Tesla Vision" rolls out - assuming that's really a thing and not just incremental updates pitched as "Tesla Vision!"

They've said no such thing about not using radar on cars that already have it installed. And for a lot of reasons we can get into, that might actually be a stupid move on their part.

Remember that the driver-assistance features in Tesla's firmware already allow for different vehicles having different combinations of sensors (e.g., older Model S and X vehicles have different sensor suites than current ones), or for certain sensors being broken and compensating for it in many situations, or at least alerting the driver that a sensor is blocked or broken. That's the other side of the coin for this discussion. We presume the firmware is no longer "polling" for the radar units on new Model 3/Y vehicles that don't have radar, but who knows for sure?

Moreover, there are legal implications for selling vehicles that were advertised as having certain features and shipped with those and then...just not using them. In fact, that's why Tesla had to legally contact all customers who had ordered Model 3/Y vehicles before the change was announced and let them know radar units were being left out of their ordered vehicles, giving them a chance to cancel their orders.

In addition, there's tons of writing by experts who are REALLY skeptical about removing radar in terms of system safety and performance in all-weather, night, etc.

By the way, as an aside, Tesla throws around the term "neural net" processing, but their cars don't actually do "neural net" processing in real time. They may use some heuristics-derived rules based on inputs from millions of their vehicles for TRAINING the rules, but that's not the same thing as the cars all being part of a living "neural net"...it's a term they've been abusing for years now to sell cars. Your Tesla is computing alone, by itself, when it drives. It's not part of a "net" -- neural or otherwise.

In short, you're really just speculating that Tesla will stop polling the radar units or using them on vehicles so equipped, but there's little in the way of evidence, statements, or actions on the part of Tesla that indicate they plan to do so.
 
To be clear, you've said nothing so far that establishes that Tesla actually plans to stop using radar inputs on vehicles already equipped with radar, even as "Tesla Vision" rolls out - assuming that's really a thing and not just incremental updates pitched as "Tesla Vision!"

What do you think vision-only cars are running on if Tesla Vision is "not a thing"?


They've said no such thing about not using radar on cars that already have it installed.

Yes, they have actually.




Later they mentioned the long term plan is to move all functions, not just city streets, to the same code base that FSDBeta is using.

Pure vision- no radar.


Remember that the driver-assistance features in Tesla's firmware already allow for different vehicles having different combinations of sensors (e.g., Older Model S and X vehicles have different sensor suites than current ones)

The older S/X cars (and VERY few relative to total fleet size) have a different color filter on the cameras....they're otherwise literally the same cameras.

The radar (again on a tiny fraction of the fleet, and S/X only) is also different, but the difference is quite small (I think like 10 meters longer range maybe on the newer one?) and if they're removing radar entirely obviously that won't matter at all.



We presume the firmware is no longer "polling" for the radar units on new Model 3/Y vehicles that don't have radar, but who knows for sure?

Nearly everyone?

Green has discussed this on twitter multiple times.

Each car has a config file that tells the software what car it's running on. No-radar cars are flagged that way, it knows there's nothing to "poll" for.
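As a rough illustration of that kind of gating (the config format and field names here are invented for the example, not Tesla's actual configuration):

```python
# Illustrative sketch only: the software reads a per-vehicle config at
# startup and simply skips radar polling when the car is flagged as
# radar-less. Keys and values are hypothetical, not Tesla's real format.

vehicle_config = {
    "model": "Model 3",
    "autopilot_hw": "HW3",
    "has_radar": False,  # no-radar cars are flagged in the config
}

def active_sensors(config):
    """Decide which sensors to poll based on the vehicle's config flags."""
    sensors = ["cameras"]
    if config.get("has_radar", False):
        sensors.append("radar")  # only polled when the flag says it's fitted
    return sensors

print(active_sensors(vehicle_config))  # ['cameras']
```

With a flag like this there is nothing to "poll" for on a radar-less car; the code path simply never runs.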


Currently cars WITH radar still use it in the production highway AP code- because the highway AP code base is still the old one built around sensor fusion... and the vision version for that purpose isn't quite at parity yet... (hence why vision only cars have a lower max speed and a longer minimum follow distance).



Now, of course they might end up running into some unforeseen technical problem where they CAN'T get hwy capabilities to parity with vision only-- but it's been their announced plan for several years to do it, so they'd have some issues if they can't.


Moreover, there are legal implications for selling vehicles that were advertised as having certain features and shipped with those and then...just not using them. In fact, that's why Tesla had to legally contact all customers who had ordered Model 3/Y vehicles before the change was announced and let them know radar units were being left out of their ordered vehicles, giving them a chance to cancel their orders.

You're confusing HW and SW.

When the people ordered, radar was part of the listed HW- so you had to agree you'd be ok accepting a car that lacked it.

But AP and FSD are software features.... they're enabled via hardware, but as long as Tesla delivers the promised capabilities it's not a problem if they change HOW they do it...including ceasing to regard radar input.

That's likely why they're still keeping radar "on" for the cars that have it until they reach feature parity... they'd have an issue if they suddenly reduced existing owners' top speed and forced increased follow distances, as that'd be taking away actual features they have today.




In addition, there's tons of writing by experts who are REALLY skeptical about removing radar in terms of system safety and performance in all-weather, night, etc.

There's tons of "experts" who insist this is impossible to do without LIDAR too.

Tesla, and many other "experts", think they're wrong.

Since nobody actually has a real L5 system today nobody really knows which experts are right or not.


By the way, as an aside, Tesla throws around the term "neural net" processing, but their cars don't actually do "neural net" processing in real time.


This is entirely false.

They actively run neural nets- many of them- live on the car.... it's the primary way perception is handled in the vehicle.

Most planning and control code is still traditional C... but even some planning has now started using NNs in the car, and they intend to move increasingly more of it in that direction.



Your Tesla is computing alone, by itself, when it drives. It's not part of a "net" -- neural or otherwise.

Ah... you have no idea what neural net means, gotcha.

They have no need to "talk to other cars" to run neural networks.
 
I have to say that your reply has a strong fanboi-infused tone, IMO. A few thoughts in response, nonetheless:

First of all, nothing you posted proves "Tesla Vision" is going to stop using radar on vehicles that still have it. It's so-named because it's CAPABLE of functioning -- at some TBD level of performance relative to multi-sensor systems -- on the camera-only vehicles that Tesla started putting out. It's as much a marketing and PR move as it is rooted in any actual technological advance. They just decided to come up with a new marketing name to make what is now a large weakness sound like some sort of earth-shattering advantage and strength. Thus far, it isn't.

And the Elon tweet you posted doesn't mean what you claim it does. He's referring there to software that is capable of running on camera-only vehicles...because it HAS to. The (unsourced) statement that they intend to move "all functions" to the same "FSD code base" is fine, but it could just mean that they intend to *restore* all functions, including those recently handicapped by their radar decision, to the vision-only algorithms. I can read that tweet and the (unsourced) statement as meaning they would still retain radar-usage capability in the code. Besides, even if it means what you think it does (i.e., ignoring existing radar units), it doesn't matter what their "intent" is. Tesla has "intended" a lot of things for many years that haven't come to pass. I'll believe a camera-only system can do just as well, and just as reliably when I see it (more on the physics of this below...).

Secondly, there's very little evidence that "Tesla Vision" is performing on par with existing systems, and quite a bit of evidence to the contrary. You pointed out that Tesla had to limit previous features in the camera-only M3/Y. And there are reports of some sketchy behavior, at least in amateur testimonial videos, etc.

Personally, I get the sense that Tesla is scrambling to figure out how to handle what was likely an impulsive, supply-chain-based decision on Elon's part by accelerating "camera only" techniques they were working on but didn't necessarily intend to field, much less *immediately*, in response to the snap decision to stop installing radar.

Thirdly, you appear to be claiming Tesla always intended to go to vision-only and it just took a while. At least that's how I read your reply. If that's what you meant, it's kind of hilarious. Just two or three years ago, Elon had glowing tweets singing the praises of radar when combined with cameras, and talking about how it's really a necessity for low-light and poor-weather conditions and how you really need multiple sensors.

Fourth, and by far the most important group of objections I have is that you're just flatly misstating what the HUGE preponderance of experts believe about automotive sensor fusion and you're sort of ignoring points of physics, which are legion... ;^)

For example, there's NO amount of "camera only" processing that can see two (or even three!) cars ahead the way mmW automotive radar units can by making use of both direct radar returns through non-RF-reflective openings in a scene (e.g., glass, etc.) and by using multi-path, multi-bounce returns (e.g., beneath cars ahead to see *two* cars ahead and accurately represent distances and relative speeds). That's why camera-only M3/Y drivers posting online immediately noted less visualization of multiple cars ahead of them than they'd noted when driving other Teslas...the radar data to provide that is just gone. When a car moves in front of another one, the cameras just think it's gone. It disappears from view, and it's not at all clear the vehicle is keeping track of the existence of that car.

And tracking vehicles without the direct measurement of range and speed that one gets from radar is much harder. It doesn't matter how fancy Tesla claims they're getting with camera algorithms, they can't change the physics of direct measurement of these things or invent a way out of line-of-sight obscuration, which radar can sometimes deal with.
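To make the "synthetic vs. direct measurement" point concrete: a classic monocular range estimate uses the pinhole-camera model, where distance is inferred from an assumed real-world size of the target, so any error in that assumption propagates straight into the range. This is the textbook model with made-up numbers, not anyone's production pipeline:

```python
# Pinhole-camera range estimate (textbook model, illustrative numbers):
#   distance = focal_length_px * assumed_real_height_m / observed_height_px
# Unlike radar, the distance is inferred, not measured, so a wrong size
# prior produces a proportionally wrong range.

def estimate_range_m(focal_length_px, assumed_height_m, observed_height_px):
    return focal_length_px * assumed_height_m / observed_height_px

f_px = 1400.0   # focal length in pixels (hypothetical camera)
pixels = 35.0   # apparent height of the car ahead in the image

# Assume the car ahead is 1.5 m tall when it's really 1.8 m tall:
print(estimate_range_m(f_px, 1.5, pixels))  # 60.0 m
print(estimate_range_m(f_px, 1.8, pixels))  # 72.0 m -- 20% error from one bad prior
```

A radar return, by contrast, yields range and radial speed directly from time-of-flight and Doppler shift, with no size assumption in the loop.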

And collision warning and avoidance without direct measurement of range and speeds? Not to mention accurate adaptive cruise control? I mean...why would you throw that data overboard for little in the way of cost savings? I mean...this is like a big science experiment.

By the way, contrast the previous Tesla website, which used to have radar and talk about something like up to 500 or even 1,000 meters of range and poor-weather performance, with the new site. Now, it's got the camera-only representations with the pitch that it's "powerful visual processing out to a hundred meters" or something to that effect. Wow! All the way out to 100 meters, Elon? And "powerful" now? It's all marketing, IMO.

Similarly, the physics behind why radar and LIDAR can function in darkness (they're ACTIVE sensors...duh) and/or in poor weather isn't going to change because Tesla is trying to get fancier with cameras. Cameras are often blocked or obscured by rain, snow, or debris (and my Model Y frequently pops up warnings about blocked cameras in moderate rain). Similarly, LIDAR, as an active sensor, can form true 3D scene cube data (using "point cloud" data very rapidly) and process it in ways that cameras can only approximate, at best. There is just no range or speed data inherent in passive cameras...all the processing is synthetic attempts to glean such data, and they're all prone to errors.

As an aside, you sound like you're on board with Musk's ridiculous dissing of LIDAR. He never seems to explain his reasons in any coherent way. He usually just blasts it in a statement to a reporter or a tweet without really ever explaining why (or says something that's not quite technically correct and moves on). I think someone just told him early on, years ago, that he didn't NEED to add LIDAR since he'd already put the radar in, or he just wasn't as comfortable in his understanding of LIDAR, so it's "bad" in his mind. Plus, he HAS to make it sound as if every decision he's made is based on some grand understanding that everybody else simply lacks. Hence "LIDAR IS BAD BUT I DON'T REALLY HAVE TIME TO EXPLAIN WHY OR PROVIDE TECHNICALLY ACCURATE REASONS!"

Elon can't will away the physics of solar intrusion on sensors, or...you know...darkness, which is sort of associated with...you know...night. The dynamic range of cameras' focal plane sensors is what it is, and while improvements are always being made, it's really unlikely that inexpensive extremely low-light automotive-grade optical cameras are on the horizon that can do what active sensors can do at night or in poor weather where visible wavelengths of light just don't get very far. That's why numerous industry studies -- and European government and university studies -- show that camera-only automotive systems fare much more poorly in darkness and poor weather, particularly when pedestrians and cyclists are around. There's just no comparison. It's...*ahem*...literally like night vs. day. ;^)

Every other major automaker is continuing to rely on sensor fusion, particularly in the face of rapidly dropping unit costs of automotive radar and LIDAR. Field tests by independent agencies, safety organizations, and university research groups all concur that camera-only systems generally fare worse than multi-sensor techniques. I won't go track down percentages, but I'll say you're just plain incorrect when you claim there are equal numbers of experts saying the contrary. It's just not true.

I continue to believe that Tesla's counter-intuitive, contrarian direction, while edgy-seeming and cool to talk about (Neural nets! Powerful visual processing!) is waaaaay outside the consensus opinion. There's near uniformity among technically savvy experts who've been doing automotive driver assistance research for decades that Tesla is making a risky move here without solid data to back it up. Do I have the exact numbers (e.g., 90% of experts? Whatever)? No, but the reporting is fairly consistent and the theme that emerges is quite clear.

I continue to believe this was a snap decision by Musk when faced with the threat of slowing his production line due to radar unit shortages and with the impending reduction of Government subsidies and carbon tax credits that will eat into his bottom line. They're trying to dig out of both a looming cost hole and supply chain over-reaction by Musk simultaneously.

R,
B
 
I have to say that your reply has a strong fanboi-infused tone, IMO. A few thoughts in response, nonetheless:

First of all, nothing you posted proves "Tesla Vision" is going to stop using radar on vehicles that still have it.

I mean, I cited the CEO of the company saying the next-gen system is using vision ONLY, but if you won't accept proof as proof I can't help you.


And the Elon tweet you posted doesn't mean what you claim it does. He's referring there to software that is capable of running on camera-only vehicles...because it HAS to.

This is also wrong.

FSDBeta runs on radar AND no radar vehicles.

But without radar on either when running the city streets code.



The (unsourced) statement that they intend to move "all functions" to the same "FSD code base" is fine

Again this is factually wrong.

It's not unsourced- it's again from Elon Musk.


And again if you don't believe the CEO, I can't help you with "better" sources than that.


(Editor's note: it appears the target here has slipped at least slightly, as 10.1 doesn't use the single stack yet; it could show up in 10.2, or could take longer. But the fact that they intend to move everything to that stack is not, remotely, an "unsourced" statement.)


, but it could just mean that they intend to *restore* all functions, including those recently handicapped by their radar decision, to the vision-only algorithms. I can read that tweet and the (unsourced) statement as meaning they would still retain radar-usage capability in the code.

How you can read "vision only" and "pure vision" as "vision AND radar" is a bit of a mystery here.



Besides, even if it means what you think it does (i.e., ignoring existing radar units), it doesn't matter what their "intent" is. Tesla has "intended" a lot of things for many years that haven't come to pass. I'll believe a camera-only system can do just as well, and just as reliably when I see it (more on the physics of this below...).

I mean- I literally said this in the post you're quoting.

That is the current plan. It might or might not work out. Previous plans, when they haven't worked out, have caused Tesla to realize it and change what they're doing.

(this is one of the aspects that makes Tesla so great- they don't keep making a mistake for many additional years because they spent so much time making it... they pivot out of the mistake as soon as they can upon realizing it is one).

We don't know if this will be one or not yet. We only know what they're currently attempting as their goal and how they're currently trying to get there.



Secondly, there's very little evidence that "Tesla Vision" is performing on par with existing systems, and quite a bit of evidence to the contrary. You pointed out that Tesla had to limit previous features in the camera-only M3/Y.


Yes. I did.

So it's weird you're trying to make a counter-point to my post by repeating what I already said.



Thirdly, you appear to be claiming Tesla always intended to go to vision-only and it just took a while. At least that's how I read your reply. If that's what you meant, it's kind of hilarious. Just two or three years ago, Elon had glowing tweets singing the praises of radar when combined with cameras, and talking about how it's really a necessity for low-light and poor-weather conditions and how you really need multiple sensors.


Uh.... nope.

3 years ago he was saying intent was vision only, and radar might be a backup to it.




I think you're stuck on what he said 5 years ago about Radar.


Again- great thing about Tesla- when they get new info, they change their mind.



For example, there's NO amount of "camera only" processing that can see two (or even three!) cars ahead the way mmW automotive radar units can by making use of both direct radar returns through non-RF-reflective openings in a scene (e.g., glass, etc.) and by using multi-path, multi-bounce returns (e.g., beneath cars ahead to see *two* cars ahead and accurately represent distances and relative speeds).

Of course humans can't see that way either- and seem to drive just fine.


Teslas point was that radar fusion overall makes the system worse.... not that there's NO situation where it can add value.

If it makes it worse 90% of the time and better 10% of the time, it might make sense to remove it. Which is what they've done.

Go re-watch AI day (or actually I think watch it for the first time, since this seems like new info to you...). Karpathy spends a fair bit of time showing real-world examples of this and why they made this decision.



That's why camera-only M3/Y drivers posting online immediately noted less visualization of multiple cars ahead of them than they'd noted when driving other Teslas...the radar data to provide that is just gone. When a car moves in front of another one, the cameras just think it's gone. It disappears from view, and it's not at all clear the vehicle is keeping track of the existence of that car.

Again this was debunked on AI day showing the beta code keeping track of such cars just fine.... and its ability to do that was part of why they were confident with removing the radar.

So again you appear not to have watched this at all yet are trying to tell us how well it's working?



And tracking vehicles without the direct measurement of range and speed that one gets from radar is much harder. It doesn't matter how fancy Tesla claims they're getting with camera algorithms, they can't change the physics of direct measurement of these things or invent a way out of line-of-sight obscuration, which radar can sometimes deal with.

This again appears factually untrue.

They actually showed plots during AI day (yet more evidence you didn't even watch it) comparing them... vision managed to get quite close to radar even at what is still a relatively early stage.

There's many AI papers published in recent years showing vision can get within a couple percent accuracy to even LIDAR on this stuff using OLD code and cameras, let alone newer methods.

For driving, you don't need mm level distance, because you're never supposed to be remotely that close to things anyway.... nor do you need 100ths of a second speed accuracy for this task.
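That accuracy argument can be sanity-checked with back-of-envelope numbers (all illustrative, not from any spec):

```python
# Back-of-envelope check of the accuracy claim (illustrative numbers only):
# at highway speed with a 2-second follow distance, how big is a 2% range
# error relative to the gap the car is trying to maintain?

speed_mps = 30.0                   # ~108 km/h
follow_time_s = 2.0
gap_m = speed_mps * follow_time_s  # 60 m target gap
range_error = 0.02 * gap_m         # a 2% vision ranging error

print(gap_m)         # 60.0
print(range_error)   # ~1.2 m of uncertainty on a 60 m gap
```

On these (assumed) numbers, a 2% ranging error is about a meter of uncertainty on a 60-meter gap, which is the scale of the argument that millimeter-level precision isn't required for car following.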




And collision warning and avoidance without direct measurement of range and speeds? Not to mention accurate adaptive cruise control?

Yup.

Science is pretty cool.


I mean- again HUMANS drive without radar... including maintaining follow distance, avoiding collisions, etc.

They do it with a mere two cameras of the same general focus and capabilities.... Tesla has 8 (including 3 facing forward)


I mean...why would you throw that data overboard for little in the way of cost savings?

Because it makes the overall system worse.

Go watch AI day and get back to us- it corrects a lot of things you're wrong about or not understanding well.



By the way, contrast the previous Tesla website, which used to have radar and talk about something like up to 500 or even 1,000 meters of range


No, they have not.

Again your basic facts are simply wrong.


The radar on the car has a max range of 160 meters....not 500-1000.

The cameras can see out as far as 250 meters.


and poor weather with the new site. Now, it's got the camera-only representations with the pitch that it's "powerful visual processing out to a hundred meters" or something to that effect. Wow! All the way out to 100 meters, Elon? And "powerful" now? It's all marketing, IMO.

No, 250 meters. Significantly more than the radar was capable of.


The Tesla website that you claim to have read, but clearly have not, says:
Eight surround cameras provide 360 degrees of visibility around the car at up to 250 meters of range


I'd again suggest you go do a lot more reading and get your basic facts corrected before trying to continue the discussion.




Similarly, the physics behind why radar and LIDAR can function in darkness (they're ACTIVE sensors...duh)

If only cars had, I dunno, external lights in the dark or something....


As an aside, you sound like you're on board with Musk's ridiculous dissing of LIDAR. He never seems to explain his reasons in any coherent way.

Sure he does. You just don't understand them.


tl;dr version is: if you get vision right, LIDAR cannot provide any additional info you need to drive.

Solved vision already gets you all the speed and distance data you need to drive.

If you have NOT solved vision then LIDAR would simply be a crutch for a vision system that needs more work before it's safe.



That's why numerous industry studies -- and European government and university studies -- show that camera-only automotive systems fare much more poorly in darkness and poor weather, particularly when pedestrians and cyclists are around. There's just no comparison. It's...*ahem*...literally like night vs. day. ;^)

Do you have some links to such peer reviewed studies (not just PR releases from LIDAR companies) you could provide that directly compare vision only and LIDAR systems as you describe?


Every other major automaker is continuing to rely on sensor fusion

And continuing to offer significantly weaker actual features in their systems.

Ford's "new" system famously turns itself off if your highway curves even slightly, for example.


Ford’s advanced driver-assist suite could not take the off-ramp by itself. This made it quite different from the Autopilot that Munro has been using for some time now in his Tesla Model 3....
...But that’s just the tip of the iceberg. Munro soon learned that BlueCruise required manual interventions when navigating curves. And when these happened, the alerts prompting drivers to keep their hands on the wheel did not include loud, audible warnings. Manual interventions with BlueCruise were also required for basic maneuvers like lane changes


And again, that's their NEW system, not some years old one or something.

Less capable than the one Tesla was putting on cars generations of AP HW ago in 2014, let alone the current setup.

Even what is largely considered the "best" competitor (Supercruise) is significantly less capable overall (AND costs more too)




Field tests by independent agencies, safety organizations, and university research groups all concur that camera-only systems generally fare worse than multi-sensor techniques. I won't go track down percentages,


...or provide any actual sources to support your claims.


I wonder why? :)




Anyway- go watch AI day. Several times if needed. If you have any specific questions on stuff you didn't understand feel free to let us know.
 
Wow...you're DEEP in Musk's grasp...

First of all, you repeatedly say things like "Elon said XXX so it's true." He can make whatever claims he wants, but he's been saying things for ten years that still haven't come to pass. He's been saying "full self driving" is "a few months away" for...years. It's a running joke.

Secondly, it's completely false that Musk has "always" said "vision only" was the goal and radar was a temporary backup. That's absolute, 100%, pure bull. There are tons of direct Musk quotes to the contrary. Here's one from Sep 2016:

"Radar sees through rain, fog, snow, dust, and essentially quite easily. So even if you are driving down the road and the visibility was very low and there was a big multi-car pileup or something like that and you can't see it, the radar would and it would initiate braking in time to avoid your car being added to the multi-car pileup. In fact, an additional level of sophistication – we are confident that we can use the radar to look beyond the car in front of you....so even if there’s something that was obscured directly both in vision and radar, we can use the bounce effect of the radar to look in front of that car and still brake. It takes things to another level of safety."

Thirdly, you talk about "solving" vision -- again parroting Musk's talking point in his own language -- as if it's established that there even is a "solution" to computer vision-only driving and it's just a matter of cranking a gazillion more cases and examples through Tesla's training algorithms.

The fact that people can drive with two eyes and without a radar/LIDAR in their heads is not an existence proof that cars can do it in a way that adds safety. People also use their hearing when driving in many situations; they also read expressions, intent, and hand-gestures from other drivers when making decisions and sometimes even speak to other drivers through open windows. If you're going to use Musk's "existence proof" argument -- which Facebook's CTO comically did when defending the radar removal -- then you have to acknowledge the "existence proof" involves people with two eyes, two ears, the emotional intelligence to read facial expressions, and the ability to understand gestures, etc. So by this logic, Tesla's system is deficient because it's missing all the other features human drivers use.

Besides, if vision-only vehicles need to disable features in snow or rain, or when a camera is blinded in fog -- which Teslas sometimes do -- has it added to safety under those conditions? It's the same as people needing to pull over in dense fog. You keep falling into the trap of thinking that driver assistance features only need to do a few things as well as humans and we're good to go. "Full Self Driving!" Vision is "solved"...! :rolleyes:

Regarding LIDAR: Sure, if one asserts that "vision can be solved" and get you all the data you need, then of course you're going to claim that any other data source is redundant and not needed. That's Musk's -- and your -- circular argument. Go ahead and show me a quote where he explains an actual *technical deficiency* in LIDAR that's specific to the ways other automakers are trying to make use of it. And don't tell me I "don't understand" Musk's supposedly visionary LIDAR criticism. Show me examples of Musk explaining specific technical deficiencies of LIDAR and radar in actual automotive instantiations. Or you could, you know, go ahead and explain it to me yourself. Go ahead and assume that I'm a physicist at a large corporation working directly with teams using LIDAR to fly aircraft, UAVs, and other vehicles. Don't dumb it down for me. ;^)

Fourth, regarding other automakers' systems, you found a single example of some Ford features that Munro found inferior to Tesla's. Okay, if we're going to start a Google war of non-peer-reviewed YouTube videos and magazine articles, here's a counterexample:


Fifth, I don't need to put Tesla's "AI day" on loop and watch it all day. It's marketing, dude. Of course they're going to pitch all sorts of amazing things they're supposedly doing and validate their current technology and decisions. It's like the booth at a trade show. Were those presentations "peer reviewed?" From what I read, Musk also reportedly joked around that a guy who walked on stage in a white jumpsuit and a helmet was an android that Tesla's working on that will be a fully autonomous AI with mobility, and half the fanboi-base in attendance still thinks he was serious.

Sixth, as you requested, some peer-reviewed research regarding the relative performance of camera-only systems vs. fusion systems:



This next one is a "review" style paper, but gives a pretty good sense of where things are heading, and the reference papers they list are pretty useful.


There's a ton of other work comparing all of this in IJVAS as well as in the Journal of Autonomous Vehicles and Systems. See if your workplace or university has subscriptions to either (or both) and set aside some time to read.

R,
Bill

 
Wow...you're DEEP in Musk's grasp...

First of all, you repeatedly say things like "Elon said XXX so it's true."

Given it's regarding things you claimed there was "no source" for--- and he's the source and I quoted him saying it- yes.

Why do you hate facts?

Your claim there was no source for these things is simply wrong. Be an adult, admit you were wrong, and move on.


Secondly, it's completely false that Musk has "always" said "vision only" was the goal and radar was a temporary backup. That's absolute, 100%, pure bull.


Yes- claiming I ever said "always" is 100% pure bull.

Why do you keep making claims that are 100% pure bull?

I said vision only was the claim 3 years ago

Hell, you QUOTED ME SAYING THAT, then magically changed it to 'always' in your reply, a word I never said.

I even cited a source (you know- that thing you said didn't exist) from 3 years ago confirming what I actually said.


What I actually said

3 years ago he was saying intent was vision only, and radar might be a backup to it.


And I even pointed out your claim appears to be relying on older, since updated, info-



I think you're stuck on what he said 5 years ago about Radar.

You then decided to ignore what was actually said and make up your own strawman to knock down.



If you can't argue honestly, maybe you shouldn't argue.


Most of the next bit of your post was just hand-waving FUD so I've skipped over that down to-


Okay, if we're going to start a Google war of non-peer-reviewed YouTube videos and magazine articles, here's a counterexample:


I guess, once again, you did not actually read beyond the headline

The actual driving system features and performance of Tesla's system ranked #1 there.

Tesla scores #1 for ease of use, as well as.... capabilities and performance

You know- just like I told you, and which you wrongly thought your link disagreed with.

Caddy only "wins" because it has a bunch more "driver is an irresponsible idiot who can't be bothered to read a manual" nannies on its less capable system.


Again- if you're not interested in an honest discussion, maybe sit out the discussion?



Fifth, I don't need to put Tesla's "AI day" on loop and watch it all day.

Watching it once would correct a lot of your inaccurate claims.

If you understood it and were genuinely interested in the truth anyway.... which seems increasingly in doubt.


It's marketing, dude

It REALLY was not.

More evidence you haven't ever seen it though.



From what I read, Musk also reportedly joked around that a guy who walked on stage in a white jumpsuit and a helmet

Ah, finally an outright admission you're making claims about something you've never seen.... accidental honesty there? :)


Sixth, as you requested, some peer-reviewed research regarding the relative performance of camera-only systems vs. fusion systems:

Sweet! Let's see how not-at-all you've read or understood some MORE stuff!




YOUR source said:
...Lastly, the dataset can also be expanded by incorporating a greater variety of frames and scenarios such as urban/city driving, pedestrians, road-side clutter and adverse weather conditions.


So.... they used a pre-annotated data set-- of highway only driving-- with clear roads and perfect weather.

SHOCKING they got good results!

Tesla had good results in those conditions in 2014 in their production system.

YEARS more of actual real-world data in all conditions taught them the disparate radar fusion data made things worse more often than it made them better.

Which- again- is explained in technical detail in the AI day stuff you can't be bothered to watch.




This one.... doesn't actually cover the topic you claim it does.

It largely compared "humans", "autonomous vehicles" and "connected autonomous vehicles"

At no point does it actually compare "camera only" AVs to "sensor fusion" AV performance. The nearest it comes is comparing a vehicle with EVERY sensor type to one with EVERY sensor type -and V2V- added. Which isn't remotely relevant to the topic... but it used the word "fusion" (fusing on-car and off-car info) so you thought it applied.

So again- a paper you didn't bother to actually read, let alone understand.


Hilariously, further down, the paper outright disagrees with you (the other time it comes even remotely near the actual topic)--- it mentions shortcomings of single-camera systems, which can be solved with multiple cameras... (Tesla has 8).... and then mentions the problem with fusion

YOUR source said:
waiting for such sensor agreement can be problematic, especially when “complex or unusual shapes may delay or prevent the system from classifying certain vehicles as targets/threats”


Which is exactly the problem Tesla found in the real world

When radar and cameras disagree you have problems.

So they removed the vastly lower resolution sensor, and focused on improving the much higher resolution one.
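To make the disagreement problem concrete, here's a toy sketch (not Tesla's actual logic; the thresholds and confidence numbers are invented) of why requiring camera/radar agreement before acting can suppress or delay a response the camera alone would have made:

```python
# Toy late-fusion gate: flag a threat only when BOTH sensors agree,
# vs. a vision-only policy. All numbers are made up for illustration.

def agreement_fusion(camera_conf: float, radar_conf: float,
                     threshold: float = 0.7) -> bool:
    """Flag a threat only when both sensors clear the threshold."""
    return camera_conf >= threshold and radar_conf >= threshold

def vision_only(camera_conf: float, threshold: float = 0.7) -> bool:
    """Flag a threat from the camera alone."""
    return camera_conf >= threshold

# A stationary object the camera sees clearly but radar mis-ranges
# (e.g. an unusual shape with a weak return):
camera_conf, radar_conf = 0.9, 0.4

print(agreement_fusion(camera_conf, radar_conf))  # False: fusion waits
print(vision_only(camera_conf))                   # True: vision reacts
```

Flip the numbers (camera weak, radar strong) and the same gate suppresses a radar-only false positive instead -- which is the trade-off both sides of this thread are arguing about.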




This next one is a "review" style paper, but gives a pretty good sense of where things are heading, and the reference papers they list are pretty useful.


They'd be more useful if you'd actually read them of course :)

Once again this source does not say what you claim it does.

It does not compare "vision only" to anything.

Instead it compares 3 different sensor-fusion methods (high, low, and mid level) and compares different fusion calibration techniques.

But hey, it has the word "fusion" in it so you thought it was relevant at a glance I guess without bothering to read or understand it.

Again.
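For anyone following along, the "low/mid/high level" terms in that paper describe where in the pipeline the sensor data gets combined. A minimal sketch of the two extremes, with toy data and a stand-in detector (nothing here is from the paper or from Tesla):

```python
# Purely illustrative: low-level fusion combines raw sensor data before
# detection; high-level fusion runs a detector per sensor and merges the
# object-level outputs afterward. Real pipelines use learned detectors
# over images and radar point clouds.

def low_level_fusion(camera_data, radar_data, detect):
    """Combine raw sensor data first, then run a single detector."""
    return detect(camera_data + radar_data)

def high_level_fusion(camera_data, radar_data, detect):
    """Run a detector per sensor, then merge the object-level outputs."""
    return detect(camera_data) | detect(radar_data)

# Stand-in "detector": keeps anything tagged as an object.
def detect(data):
    return {item for item in data if item.startswith("obj_")}

camera_data = ["obj_car", "glare"]     # camera sees a car, plus noise
radar_data = ["obj_truck", "clutter"]  # radar sees a truck, plus clutter

print(low_level_fusion(camera_data, radar_data, detect))
print(high_level_fusion(camera_data, radar_data, detect))
```

Mid-level fusion sits between the two, merging intermediate features rather than raw data or finished detections.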


The study DOES mention that a large, quality, robust, dataset is critical to getting self driving right though.

And Tesla, of course, has a vastly larger dataset than anyone else thanks to fleet collection.

Which suggests maybe they have some idea WTF they're doing if they make a change like this, and have data to support their doing it.

Another thing you'd be aware of if you'd actually watched AI day and could understand what is presented.
 
Okay, last post because I just don't have time to debate this with you, and now you're just overtly lying about the body of peer-reviewed research out there. But you "WATCHED A.I. DAY!" so....yeah. :rolleyes:

Dude...I literally work with teams performing multi-sensor fusion projects of exactly this type for a major aerospace company. You might have missed that. I don't need to "WATCH A.I. DAY!"....particularly not the part with the guy in the white leotard who's supposed to foreshadow Tesla's "android" concept. LOL.

Right now, multiple experts and knowledgeable insiders say it's not happening. And if you don't trust them, there are even recognized gurus ON THIS FORUM who have decompiled the firmware, and say it isn't likely to happen based on what they're seeing. For example, go talk to user YieldFarmer, who has the entire code tree decompiled, talks to insiders (including the famed "Green" on Twitter), and even retrofitted a radar into his non-radar equipped Tesla and sees that Tesla's code is still pulling from the radar in the latest updates, which are supposedly "Tesla Vision!".

Moreover, Tesla Service Center managers say they have direction to maintain replacement radar units and supply chains for any that fail on existing vehicles with radar (call the Rockville, MD and Sterling, VA managers, for just two examples of people who have said this to me personally).

In addition, Tesla customer service agents are telling existing customers that their radar units will still provide data and be used.

But...sure....we'll await your proof when it's actually happened and I'll Venmo you $500. Until then, I'm not going to spend any more time on this thread.

R,
B
 
Okay, last post because I just don't have time to debate this with you, and now you're just overtly lying about the body of peer-reviewed research out there.

Dude.

In all my replies I was quoting the research you insisted we should look at.

Except, like AI day, you didn't actually look at it. You just assumed it supported you because of a few keywords- but when you actually read it, turns out it doesn't.

Not sure how you can say I'm "lying" by directly quoting YOUR sources.

Unless- again, you're replying without actually reading the stuff you're talking about.

On brand for you at least :)



Dude...I literally work with teams performing multi-sensor fusion projects of exactly this type for a major aerospace company. You might have missed that.

No, I actually read what people write.

I was just pointing out you apparently don't understand what those teams you "work with" (but aren't actually ON) do.

Because again your own sources contradict much of your claims.


I don't need to "WATCH A.I. DAY!"

This, like most of your claims, is obviously untrue.

You need to watch it pretty desperately because it dispels some of the false information you appear to have made up about how any of this works.

Hell, so do a few of the sources YOU posted, that you ALSO didn't actually read beyond checking they had the word "fusion" in them :)


Right now, multiple experts and knowledgable insiders say it's not happening.

When you keep being shown to be wrong, and your only counter evidence is to insist "some experts say..." that's a logical fallacy called "appeal to authority".

It's even less useful here since folks can cite multiple experts saying the opposite... and even some of YOUR OWN SOURCES contradict you.


And if you don't trust them, there are even recognized gurus ON THIS FORUM who have decompiled the firmware, and say it isn't likely to happen based on what they're seeing. For example, go talk to user YieldFarmer, who has the entire code tree decompiled, talks to insiders

You can't "decompile" a neural network.

That's not how any of that works.

Again you should stop talking about this stuff. Especially when the more you say the more obvious your ignorance is.

Nor is it even clear what the "it" in "it isn't likely to happen" is.

If you mean using vision to replace radar- it already happened- Tesla has a bunch of vision only cars on the road. Which are determining speed and distance without radar- and the same "experts" you mention have found they're doing a pretty good job of it already.



Moreover, Tesla Service Center managers say they have direction to maintain replacement radar units and supply chains for any that fail on existing vehicles with radar (call the Rockville, MD and Sterling, VA managers, for just two examples of people who have said this to me personally).

Oh...yes... the noted neural network expert "Tesla service center manager"

You are hilarious

Now I kinda hope this ISN'T your last reply as you claim- you are comedy gold my dude.



But...sure....we'll await your proof when it's actually happened

Proof WHAT happened?

Nobody ever claimed existing cars with radar today had the units turned off today.

Are you again making up strawmen because they're the only thing you know how to knock down?



and I'll Venmo you $500

Like anyone believes you have $500 :)



. Until then, I'm not going to spend any more time on this thread.


Remarkable how often people claim this then keep posting...

Still-- Elon himself has said if you find yourself in a hole- stop digging.

Perhaps you finally realized his advice is worth listening to!
 