Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Coast to coast drive happening this year for all FSD Teslas!

The additional neurons could be a separate NN, but that is only a logical distinction. My point is that the incremental power usage is a non-factor.



Yeah, for those who wanted auto-wipers, the plan is a fail. A rain sensor may have helped train the NN, but there is still the problem of what area the sensor sees vs. what the cameras see.
If Tesla got rid of the brake pedal and made it a slider on the touch screen you would probably be arguing that it makes sense because the NN doesn't use the brake pedal. Haha.
 

If it were touch screen only, that would be lame. For wipers, you can push the button on the stalk if you need a wipe. Are they really so bad that the wipers constantly failed to work when needed?

Wipers
To perform a single wipe with the windshield wipers, press and immediately release the button on the end of the left-hand steering column lever.
 
It doesn't rain here so it's not a problem :p
They've certainly improved the algorithm but it's not as good as other cars. I do use the stalk and I agree that the interface isn't that bad. Just seems like a silly way to save a few bucks.
 
:)
At volume, everything matters (300k cars * $10 (sensor + connectors + harness + assembly + R&D) = $3 million). Just dealing with an additional parts supplier takes resources. (spoken as a former Tier 1 employee).
 
During the shareholder meeting yesterday, Elon said that he is running the latest dev software on his car
and it is able to self-drive him from his home to the main office,
but there are a couple of disengagements from time to time, especially at intersections.
So I would say the coast to coast demo is probably not quite ready yet.
Luckily, Tesla still has 6 months before their self-imposed deadline of the end of the year to get it right.
How well is FSD able to detect and handle traffic lights and stop signs?
 
Something I'll point out again that I find funny about this. During autonomy day, someone asked Elon if Tesla could use a Neural Network to do some task, and Elon without a shred of irony said that using a NN for some simplistic thing would be needlessly complex. I think his words were nuclear weapons to kill a fly when a fly swatter will do.

Rain sensors are fly swatters. Using a NN to detect rain was foolish no matter what the purpose was for doing it. Existing sensors work very well with many types of precipitation and smudges. Use the fly swatter.

With regard to the power requirement, I agree that the energy consumption is a non-issue. But the utilization of the NN is an issue, because those are neurons and cycles that could be used to improve image processing. Basically, the computing power is a finite resource, and any resource that can be freed to do actually useful work is a resource well utilized.

The human can turn on the wipers themself.

This is really the crux of the issue, IMO. The interface to set the wiper frequency is abysmal because Tesla intended the wipers to work in automatic mode. Because they intended auto mode to be the primary mode, it is very literally dangerous to try to change the wiper mode on the screen when weather turns quickly. I'm sure there are places in the world where weather doesn't deteriorate rapidly, but I don't live in one of those places. Most people in the world don't live in California, basically.

The solution to this problem is either to make auto wiper mode work more reliably, or to change the interface. I personally would love multiple presses of the wipe button to cycle through the modes. But Tesla hasn't added this feature because still, even after all this time, they refuse to accept the empirical data showing that auto mode based on a neural network and cameras does not work.
 

If you ever watch Good Eats you'll see Alton Brown saying repeatedly almost no tool in your kitchen should be a unitasker (he does make a couple of exceptions).

My assumption is that Elon allowed the rain sensor deletion for the same reason.

Your example about the nuclear weapons vs flyswatter comment is probably an attitude he reserves for unitaskers. Using the forward cameras for rain sensing is a multitasking approach.
 
If it were touch screen only, that would be lame. For wipers, you can push the button on the stalk if you need a wipe. Are they really so bad that the wipers constantly failed to work when needed?

They are exactly that bad, yes. Pushing for a one-time wipe does nothing when the rain is heavier than a mist. During the daytime, the quality ranges from not wiping during misting rain, to not wiping in light rain, to wiping aggressively at the first sign of moisture.

At night, on the other hand, it's a whole different ball of wax. The effectiveness ranges from not wiping unless there's a monsoon, to randomly wiping in medium rain or heavier. The wipers do not work in misting, light, or medium rain. So basically, if the rain takes 2 seconds or longer to obscure your vision, you cannot rely on the auto mode at all at night, no matter how much rain has collected on your windscreen.

Here's the result of auto mode on a misting rain evening. Even with the light of tail lights from other cars in front of me, the view is already obscured and gets significantly worse before the wipers even consider wiping.

(Attached image: IMG_20190507_210301.jpg)

And here's a backup camera treated with a hydrophobic coating like Elon says will totally prevent water from building up. That also doesn't work, and it doesn't prevent ice from forming over the radar either.

(Attached image: IMG_20190507_203356.jpg)

:)
At volume, everything matters (300k cars * $10 (sensor + connectors + harness + assembly + R&D) = $3 million). Just dealing with an additional parts supplier takes resources. (spoken as a former Tier 1 employee).

"At volume everything matters" is correct. But you're not including volume discounts, and you've made up the cost of a rain sensor. There are sensors commercially available in quantities under 1,000 that cost less than $0.50 each, and they come as a complete solder-on package rather than requiring you to buy (significantly cheaper) bare IR diodes and measure the intensity yourself. So in Tesla's case, since they're already making their own camera-housing boards, it would have cost them less than a penny per car. It's literally just two IR diodes pressed up against the window, detecting a decrease in intensity when a raindrop rolls over the windscreen. All of those other costs you've accounted for are unnecessary, because they're already building the housing.
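The detection principle described here can be sketched in a few lines. This is a toy illustration, not any real part's firmware; the baseline and threshold values are assumptions:

```python
# Toy sketch of an IR rain sensor: an emitter shines infrared at the
# windscreen and a photodiode measures the light reflected back. A water
# droplet on the glass scatters light away, so the measured intensity
# drops below a calibrated dry baseline.

DRY_BASELINE = 1.0      # normalized intensity with a dry windscreen (assumed)
DROP_THRESHOLD = 0.15   # fractional drop that counts as "rain" (assumed)

def rain_detected(ir_intensity: float) -> bool:
    """True when reflected IR intensity has fallen far enough below
    the dry baseline to indicate water on the glass."""
    return (DRY_BASELINE - ir_intensity) / DRY_BASELINE >= DROP_THRESHOLD

# Dry glass reflects nearly all the emitted IR back: no wipe.
dry = rain_detected(0.98)    # False
# A droplet scatters a chunk of the light away from the detector: wipe.
wet = rain_detected(0.70)    # True
```

A production sensor adds debouncing and calibration for ambient IR, but the core signal really is that simple, which is why the parts are so cheap.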

Tesla messed up. They got too clever: they had no idea how hard it would be to build, tailor, and train a series of neural networks, and this is just one piece of evidence.
 
If you ever watch Good Eats you'll see Alton Brown saying repeatedly almost no tool in your kitchen should be a unitasker (he does make a couple of exceptions).

The difference is when a tool does a task exceptionally well, if not perfectly. If you need to cut vegetables extremely thin, quickly, and consistently, you could cut by hand and get inconsistent or poor results depending on who is cutting. Or you can buy a mandoline and get the right result 100% of the time.

My assumption is that Elon allowed the rain sensor deletion for the same reason.

That's definitely not why. He did it because he was holding a hammer and everything in the world looked like nails. He's done this repeatedly, and eventually comes around. Some day he'll make an announcement like, "Who knew manufacturing, intermodal transportation, and not using a rain sensor would be so hard?". To which everyone in those areas of expertise will facepalm and in unison yell "EVERYONE BUT YOU!"

Your example about the nuclear weapons vs flyswatter comment is probably an attitude he reserves for unitaskers. Using the forward cameras for rain sensing is a multitasking approach.

The example was about a unitasker. By the way, based on your "unitasker" rule, Tesla would just continue using a GPU for their neural networks rather than building a customized, perfectly tailored solution. In electronics, a purpose built solution will always outperform a general purpose solution in all ways except cost. Once you hit an inflection point at volume, the cost equation swings sharply in the favor of purpose built components. This is how the entire computer industry works.
 
@tomc603
I'm not saying the Tesla system works well now. I'm also not saying the removal of the stalk controls at this point was a good UI move (I pretty much lack depth perception, so pushing buttons on a flat screen without a physical reference is problematic to begin with). What I am saying is that the output from a rain sensor is insufficient for the needs of AP camera windshield cleaning.

Rain sensors typically use an IR emitter and an IR detector in a specific physical arrangement. See: Rain sensor - Wikipedia

Tesla could have integrated that into the tri-camera housing where it attaches to the glass; however, it would not cover the area used by the cameras, nor detect non-liquid blockages.

If you've ever had a passenger ask you to run the washer/wipers, then you know why a standard rain sensor is insufficient for keeping the AP cameras (or any vision system) happy.

If you own a car with a standard rain sensor, you may have anecdotes about it not doing the right thing also (ours ran the wipers on a bone dry windshield).

Rear camera occlusion detection has no cleaning device to trigger, but it can prevent the car from reversing on its own. Likewise for the side repeater/B-pillar cameras vs. changing lanes.
 
@mongo I for one do get the point of why NN weather detection can be useful. But I do think that on this one, Tesla would have been better off leaving the rain sensor in until the NNs worked well.

One of the biggest reasons I think so is that I'm not sure anyone has proven yet that camera-based weather sensing can be made reliable in the dark. If nothing else, as an active IR light source, a dedicated rain sensor would give the NNs a redundant data point in the darkness.
 

Yeah, I almost waxed philosophical on the merits of adding a pulsed IR emitter to back-illuminate water or other deposits. However, that would then potentially blind the camera during the LED on-time, interrupt the constant frame rate, and possibly mess up the camera's AGC. Then I realized that a rain sensor covering the area in front of the cameras could have the same effect in terms of backscatter.
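To put rough numbers on the timing concern with a pulsed emitter (frame rate and exposure time here are assumptions for illustration, not Tesla's actual camera specs): the IR pulse can only fire in the blanking gap between exposures without contaminating the image.

```python
# Back-of-envelope timing for a pulsed IR emitter next to a camera.
# Assumed numbers: 36 fps camera with a 25 ms exposure per frame.
FRAME_PERIOD_MS = 1000 / 36   # ~27.8 ms per frame at an assumed 36 fps
EXPOSURE_MS = 25.0            # assumed exposure time per frame

def ir_pulse_window_ms() -> float:
    """Time per frame during which the IR emitter could fire without
    appearing in the exposure (the inter-frame blanking interval)."""
    return FRAME_PERIOD_MS - EXPOSURE_MS

# Under these assumptions, only ~2.8 ms per frame is available,
# and at night the exposure is often longer, shrinking the window further.
```

The narrower that window, the less IR energy you can inject per frame, which is one reason synchronizing an emitter with a rolling-shutter camera is harder than it first sounds.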

With the night issue, if there are no photons coming in to illuminate the droplets, does the distortion from the droplets matter? What level of distortion can the NN process through?

I'm thinking the issue is that the 3 was not built to prioritize the human UI experience, but rather with the end goal of FSD. For an FSD car, only the small section of windshield in front of the cameras matters. However, drivers care about the exact opposite region of the windshield. So, in the case of a light mist that the camera heaters dissipate, the human is unhappy but the NN doesn't care. Along with this is the difference in perception regarding what level of clean the windshield needs.

This UI decision unfortunately did not align with the capability development of the NN (nor can the NN match human desire, given the varied conditions at the glass). My truck does not have auto wipers; our SUV does, but I still use the stalk, so not having that option would be annoying to me. At this point, it seems like a decent recovery step would be to add multi-click commands to the wiper switch. It would not fix the auto wipers, but it would allow quick access to different wiper modes beyond single wipe without needing to interact with the center screen.
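The multi-click stalk idea could be sketched as a small state machine. The mode names and timing window below are assumptions for illustration, not anything Tesla has implemented:

```python
# Sketch of multi-click wiper stalk behavior: an isolated press keeps
# today's single-wipe action, while rapid successive presses cycle
# through wiper modes. Mode names and window are assumed.
MODES = ["off", "auto", "slow", "fast"]
MULTI_PRESS_WINDOW = 0.5  # max seconds between presses to count as multi-press

class WiperStalk:
    def __init__(self):
        self.mode_index = 0
        self.last_press_time = None

    def press(self, now: float) -> str:
        """Handle a stalk-button press at time `now` (seconds).
        Returns the resulting action as a string."""
        if (self.last_press_time is not None
                and now - self.last_press_time <= MULTI_PRESS_WINDOW):
            # Rapid follow-up press: advance to the next wiper mode.
            self.mode_index = (self.mode_index + 1) % len(MODES)
            action = f"mode:{MODES[self.mode_index]}"
        else:
            # Isolated press keeps the existing behavior: a single wipe.
            action = "single-wipe"
        self.last_press_time = now
        return action

stalk = WiperStalk()
stalk.press(0.0)   # "single-wipe"
stalk.press(0.3)   # "mode:auto"  (within 0.5 s of the last press)
stalk.press(0.6)   # "mode:slow"
```

The nice property of this design is that it is purely additive: anyone who never double-presses sees no change from today's single-wipe behavior.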
 
Elon talking FSD at about 22 min into this video:

Seems they are working a lot on how to drive with incorrect lane markings and curbs...

Thanks. Interesting that Elon says that Tesla is really working on the NN to see curbs and recognize "driveable space" in like residential areas. Seems like Tesla is working on the city driving part of FSD.
 
@diplomat33 Quite the contrary, it seems problematic (or is it emblematic) to me.

Instead of working on the city driving part, Tesla is still working on perception.

Mind you, I’m not saying Tesla doesn’t have any driving code as well. But you need super-reliable perception before you can really start honing and building your driving policy, which is a big problem in itself too.