
Coast to coast drive happening this year for all FSD Teslas!

Yes, exactly! We each have different things we would include at the micro level. I can see the macro features being very useful for marketing. Can we have a data sheet of what actually is in the delivery, though? It seems like every release has improvements, but we don't know what they are, and folks have to decipher them through reverse engineering.

We know these descriptions of improvements should be available. Otherwise, how does management know that their developers are actually doing something? I sure would appreciate granular detail in some document I could access as an owner describing what they did. Something like "AP will now identify the primary car in white and secondary cars in grey, with noticed cars in shadow relief; secondary cars are defined to be ones that could become primary" or "AP now detects the primary car leaving the current travelling lane to avoid phantom braking". There are so many.

This kind of granularity would also be helpful to Tesla. Instead of saying something like "we improved the autowipers" and getting responses like "no you didn't", if they told us "we changed autowipers to detect mist and in that situation wipe once per 5 seconds", they would get feedback like "5 seconds is too long" or "too short". Folks who don't want the granularity can skip it all, but methinks this group wants it. Just look at the disappointment voiced each time release notes are too macro to be useful.
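To make it concrete, here is a purely hypothetical sketch of the kind of change such a note could describe; none of the names, thresholds, or intervals here are Tesla's:

```python
# Purely hypothetical illustration -- not Tesla's code or parameter values.
# Shows the kind of concrete, documentable change a granular release note
# could describe: "detect mist, wipe once per 5 seconds".

WIPE_INTERVAL_MIST_S = 5.0  # the "once per 5 seconds" from the example note

def wiper_action(precip_level: str) -> str:
    """Return a wiper behavior for a detected precipitation level."""
    if precip_level == "none":
        return "off"
    if precip_level == "mist":
        return f"intermittent, one wipe every {WIPE_INTERVAL_MIST_S:.0f} s"
    if precip_level == "light":
        return "low speed"
    return "high speed"  # moderate or heavy rain

print(wiper_action("mist"))  # -> intermittent, one wipe every 5 s
```

With the interval spelled out like that, owners could answer with "make it 8 seconds" instead of "autowipers are still bad".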

Totally agree!!! I too would love much more detail in the release notes. I sometimes think that the regressions owners see in Autopilot are the result of Tesla making a change for a good reason but the change having unintended consequences. And since owners are never told what the specific change was or what the reason was for the change, we have no context to understand the regression or perceived regression.
 
If they told us "we changed autowipers to detect mist and in that situation wipe once per 5 seconds", they would get feedback like "5 seconds is too long" or "too short".

I 100% agree. I got so tired of the auto wipers not activating soon enough that I sent a voice bug report 4 or 5 times in one week, to say specifically what was happening and what I expected. Then there was no rain for a couple of weeks.
Then yesterday I went into a store, and when I came out there was a very light mist...something where you'd want a wipe every 5 or 10 seconds. When I put the car in Drive, the wipers came on at high speed and wiped 10 or 20 times before I couldn't stand it and set them to manual. Yes, it appears the car is toying with me.
 
I 100% agree. I got so tired of the auto wipers not activating soon enough that I sent a voice bug report 4 or 5 times in one week, to say specifically what was happening and what I expected. Then there was no rain for a couple of weeks.
Then yesterday I went into a store, and when I came out there was a very light mist...something where you'd want a wipe every 5 or 10 seconds. When I put the car in Drive, the wipers came on at high speed and wiped 10 or 20 times before I couldn't stand it and set them to manual. Yes, it appears the car is toying with me.

Not to revive a dead thread, but this is a classic case of Elon realizing something after he's committed to the opposite. During Autonomy Day, he even spoke about not using neural nets for certain things because it's overkill.

One of those things is auto wipers, Elon! Just retrofit a rain sensor in our damn cars. I'll even pay $50 for it.
 
Not to revive a dead thread, but this is a classic case of Elon realizing something after he's committed to the opposite. During Autonomy Day, he even spoke about not using neural nets for certain things because it's overkill.

One of those things is auto wipers, Elon! Just retrofit a rain sensor in our damn cars. I'll even pay $50 for it.

Tesla is skating to where the puck / end goal is. Normal rain sensors work for averaged rain distribution to support human visibility needs. Autopilot needs to recognize when any of the 3 forward cameras is occluded and take action. A rain sensor will not help when there is a dead bug, bird byproduct, or salt film in front of a camera. Once rain/occlusion detection is functional from the vision system, all the money spent on rain sensors is a waste.
(The one difficult edge case being a really light mist due to camera self-heating.)
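For what it's worth, the camera-side check is conceptually simple. Here is a minimal sketch, purely my own illustration and not Tesla's approach, that flags a frame as possibly occluded when local contrast collapses:

```python
# Illustrative sketch only -- not Tesla's occlusion detector.
# Flags a camera frame as possibly occluded if image contrast collapses,
# which a windshield rain sensor could never tell you about a side camera.
import numpy as np

def possibly_occluded(gray_frame: np.ndarray, grid: int = 4,
                      min_std: float = 5.0, max_blocked_frac: float = 0.5) -> bool:
    """gray_frame: 2D uint8 array. Returns True if too many grid cells are 'flat'."""
    h, w = gray_frame.shape
    blocked = 0
    for i in range(grid):
        for j in range(grid):
            cell = gray_frame[i * h // grid:(i + 1) * h // grid,
                              j * w // grid:(j + 1) * w // grid]
            if cell.std() < min_std:  # almost no texture: smeared, fogged, or covered
                blocked += 1
    return blocked / (grid * grid) >= max_blocked_frac

# Example: a frame whose left half is featureless (e.g., smeared) trips the check.
frame = np.random.randint(0, 255, (480, 640), dtype=np.uint8)
frame[:, :320] = 128  # left half has no texture
print(possibly_occluded(frame))  # True
```

The real thing would obviously be far more robust, but the point stands: it has to look at the camera image itself, not at a patch of windshield glass.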
 
Wow... some people go to great lengths to justify that Tesla is doing it right.

In the 15 years I have been driving cars with this tech, there was not even one time it failed to do its job correctly.
Huh? I'm not claiming it works right currently. I am saying that AP requires more than a standard rain sensor.
Are you claiming that you have never had to manually operate your wipers?
 
Hmmm!? It says "USA Coast 2 Coast {something}"

[attached image: D8eaarPUEAA99Ja.jpg]
 
Tesla is skating to where the puck / end goal is. Normal rain sensors work for averaged rain distribution to support human visibility needs. Autopilot needs to recognize when any of the 3 forward cameras is occluded and take action. A rain sensor will not help when there is a dead bug, bird byproduct, or salt film in front of a camera. Once rain/occlusion detection is functional from the vision system, all the money spent on rain sensors is a waste.
(The one difficult edge case being a really light mist due to camera self-heating.)


Meanwhile, back here on earth where a human is operating the vehicle and the AP camera can only see out of a tiny fraction of the windscreen, a rain sensor offers the best utility with the least complexity. I don't really care what the camera sees when I can't see anything at all.

And this is to say nothing of all the things the NN is going to need training on just to make the wiping passable enough for people to see out the window.
 
Considering how long the NN rain sensing was missing, and how poor it still is... with the first AP2 cars reaching the end of their natural "new car" lives this year (lease periods etc.), it is clear they would have been better served by a rain sensor. It would also have given Tesla some redundancy on rain sensing down the road in tricky lighting situations.

If Tesla indeed was skating to where the puck is (I'm not sure it wasn't simply misguided cost-saving hubris on Elon's part), they failed.
 
Meanwhile, back here on earth where a human is operating the vehicle and the AP camera can only see out of a tiny fraction of the windscreen, a rain sensor offers the best utility with the least complexity. I don't really care what the camera sees when I can't see anything at all.

And this is to say nothing of all the things the NN is going to need training on just to make the wiping passable enough for people to see out the window.
The human can turn on the wipers themselves. Wipers are a requirement; auto wipers are a luxury. The NN doesn't need to care if the rest of the windshield is clear, nor do rain sensors have the ability to.

Considering how long the NN rain sensing was missing, and how poor it still is... with the first AP2 cars reaching the end of their natural "new car" lives this year (lease periods etc.), it is clear they would have been better served by a rain sensor. It would also have given Tesla some redundancy on rain sensing down the road in tricky lighting situations.

If Tesla indeed was skating to where the puck is (I'm not sure it wasn't simply misguided cost-saving hubris on Elon's part), they failed.

It is not redundancy if it does not sense the region in front of the three forward cameras. Nor does it provide any sensing of the other 5 cameras. Camera occlusion detection is a requirement for safe FSD; auto-wipers for the human are not.

How can you say Tesla has failed? All of the AP software is still a work in progress with a continually improving feature set.

Aside from the fact that it barely works, using an NN wastes a lot of energy.
The NN is running regardless. The additional neurons for rain sense do not add an appreciable amount of energy.
 
It is not redundancy if it does not sense the region in front of the three forward cameras. Nor does it provide any sensing of the other 5 cameras. Camera occlusion detection is a requirement for safe FSD; auto-wipers for the human are not.

How can you say Tesla has failed? All of the AP software is still a work in progress with a continually improving feature set.

Tesla themselves explained on Autonomy Investor Day how they use radar and ultrasonics for limited redundancy. The rain sensor could be used in similar fashion to get another input into the weather equation — to confirm or not what the NN may be seeing.

As for Tesla failing, my point is that in the period of ownership for most AP2 cars — at least the first owner — it seems a rain sensor would have in the end served us better than Tesla’s NN aspirations.

In this sense Tesla failed. Maybe Tesla is still skating towards the goal where the puck is eventually supposed to go, but they missed the pass at midfield...
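To illustrate the kind of limited redundancy I mean, here is a rough sketch with made-up names and thresholds; it is my own illustration, not anything Tesla has described:

```python
# Illustrative sketch only -- a made-up fusion rule, not Tesla's implementation.
# A cheap rain-sensor reading is used to sanity-check the vision NN's estimate,
# analogous to radar/ultrasonics being described as limited redundancy for vision.

def fused_rain_estimate(nn_rain_prob: float, sensor_rain: bool) -> str:
    """Combine the NN's rain probability with a binary rain-sensor reading."""
    if nn_rain_prob >= 0.7 and sensor_rain:
        return "raining"                 # both agree
    if nn_rain_prob < 0.3 and not sensor_rain:
        return "dry"                     # both agree
    return "uncertain - favor wiping"    # disagreement: fail safe and wipe anyway

print(fused_rain_estimate(0.9, True))    # raining
print(fused_rain_estimate(0.9, False))   # uncertain - favor wiping
```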
 
I don't think this is true so far. The weather sensing has its own NN, doesn't it? @verygreen

The additional neurons could be in a separate NN, but that is only a logical distinction. My point being that the incremental power usage is a non-factor.
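Back-of-envelope, with completely made-up numbers just to show the order of magnitude (nothing here is a real Tesla figure):

```python
# Rough illustration with invented numbers -- not Tesla's actual network sizes.
# Compares the per-frame compute of a large driving backbone to a tiny
# rain-sense head to show why the extra neurons are a rounding error.

backbone_flops_per_frame = 50e9   # assume ~50 GFLOPs for the main vision stack
rain_head_flops_per_frame = 20e6  # assume ~20 MFLOPs for a small rain classifier

overhead = rain_head_flops_per_frame / backbone_flops_per_frame
print(f"Rain-sense overhead: {overhead:.4%} of the backbone's compute")
# -> Rain-sense overhead: 0.0400% of the backbone's compute
```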

Tesla themselves explained on Autonomy Investor Day how they use radar and ultrasonics for limited redundancy. The rain sensor could be used in similar fashion to get another input into the weather equation — to confirm or not what the NN may be seeing.

As for Tesla failing, my point is that in the period of ownership for most AP2 cars — at least the first owner — it seems a rain sensor would have in the end served us better than Tesla’s NN aspirations.

In this sense Tesla failed. Maybe Tesla is still skating towards the goal where the puck is eventually supposed to go, but they missed the pass at midfield...

Yah, for those who wanted auto-wipers, the plan is a fail. A rain sensor may have helped train the NN, but there is still the problem of what area the sensor sees vs. the cameras.
 
During the shareholder meeting yesterday, Elon said that he is running the latest dev software on his car and it is able to self-drive him from his home to the main office, but there are a couple of disengagements from time to time, especially at intersections. So I would say the coast to coast demo is probably not quite ready yet. Luckily, Tesla still has 6 months before their self-imposed deadline of the end of the year to get it right.
 
During the shareholder meeting yesterday, Elon said that he is running the latest dev software on his car and it is able to self-drive him from his home to the main office, but there are a couple of disengagements from time to time, especially at intersections. So I would say the coast to coast demo is probably not quite ready yet. Luckily, Tesla still has 6 months before their self-imposed deadline of the end of the year to get it right.

I know your glasses are rose-colored, but their self-imposed deadlines have come and gone several times. This is just one of their latest. Don't hold your breath, and I do hope you have a tasty hat to eat when this one comes and goes...
 
I know your glasses are rose-colored, but their self-imposed deadlines have come and gone several times. This is just one of their latest. Don't hold your breath, and I do hope you have a tasty hat to eat when this one comes and goes...

Yes, I am aware of their missed deadlines in the past. I did not say that Tesla would do a coast to coast drive this year. In fact, I wrote that a coast to coast demo is not ready yet. I merely stated that they have 6 months left if they want to meet their deadline.