
Red light detection, stop sign detection, stopping!

When you say map, do you mean the one that is displayed? And does it show in the 'satellite' view or just the non-satellite view? I've never seen even grayed-out signals or signs.
When I say maps, I mean the maps the car uses. The displayed maps are just flat tiles/vectors from Google; the car does not use them, they are for you.

You have not seen grayed-out signals because this feature is not enabled by default. Also, you are on a Model 3, which does not let you play with those things in an easy way.
 
Traffic lights/stop signs are marked on the map; if it's not mapped, no detection is attempted. I have a couple of unmapped lights that I noticed around: one was installed a couple of months ago, and one I am not sure why it's unmapped. These are never detected. For the mapped ones, when you are in the vicinity the car shows the gray icon, and then once the light is seen and the signal is known, the signal state is added. Similarly with the stop signs.
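Sketched in code, the gating described above might look roughly like this (a minimal illustration only; the function and type names are hypothetical, not anything from Tesla's firmware):

```python
from enum import Enum
from typing import Optional

class SignalState(Enum):
    RED = "red"
    YELLOW = "yellow"
    GREEN = "green"

def intersection_icon(is_mapped: bool, seen_state: Optional[SignalState]) -> Optional[str]:
    """Icon behavior as described above: unmapped intersections get nothing,
    mapped ones show a gray placeholder until the cameras have actually seen
    the light/sign and its state is known."""
    if not is_mapped:
        return None             # not in the map -> no detection attempted, no icon
    if seen_state is None:
        return "gray"           # mapped, but signal not yet seen/read
    return seen_state.value     # mapped and read -> state shown
```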

They had v2 maps with per-lane information and such, and it looks like they now have v3 maps with even more, but I did not look into it in any detail, since it seems to be mostly CA-only anyway. Most of the mapping data they use is in the form of the v1 ADAS tiles we looked at in the past, plus the teslamaps data (I think this is where the traffic light info is stored today; no detection works if you do not have teslamaps of the right vintage or region).
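To make the vintage/region gating concrete, here is a minimal sketch; the structure, field names, and version format are all assumptions, not the actual teslamaps format:

```python
from typing import NamedTuple, Optional, Tuple

class TeslaMapsData(NamedTuple):
    region: str               # e.g. "NA"
    vintage: Tuple[int, int]  # e.g. (2019, 2)

def light_detection_available(maps: Optional[TeslaMapsData],
                              car_region: str,
                              min_vintage: Tuple[int, int]) -> bool:
    """No teslamaps data, or data of the wrong vintage or region,
    means traffic light/stop sign detection never activates."""
    return (maps is not None
            and maps.region == car_region
            and maps.vintage >= min_vintage)

# light_detection_available(TeslaMapsData("NA", (2019, 2)), "NA", (2019, 1)) -> True
# light_detection_available(None, "NA", (2019, 1))                           -> False
```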

Thank you. I appreciate it.

To clarify, you believe the traffic light info is stored in the ADAS maps, and these ADAS maps cover the majority of major roads in the US. And is the traffic light info just a boolean for the whole intersection, or is it per direction approaching the intersection? And so if the ADAS map does contain info that there is a traffic light... then the HMI will show the traffic light as you approach, and once you get close enough it will show what color/status of the light. And if it's not detecting the light, but it is marked in the map, it will approach the intersection slowly until it sees a green? (All of this assuming you have stopping at lights enabled)

Then a while back you discovered v2 maps... which I assume had a lot more precise information and a lot more layers? And contained per-lane information? Was the per-lane information only for highways, or also for urban roads? And these v2 maps were only for the state of California? And recently you found v3 maps, also only in CA? Any idea what the v3 maps have that the v2 maps did not? I recall you previously mentioned stop line information in the maps; is that in the v2 maps or in the ADAS maps? Also the visual landmarks/roadside objects/furniture, are those in the ADAS maps or the v2 maps?
 
And is the traffic light info just a boolean for the whole intersection, or is it per direction approaching the intersection
I did not really look in detail at how this is encoded, but as you approach an area with a known traffic light or stop sign, a gray image of it appears.

then the HMI will show the traffic light as you approach, and once you get close enough it will show what color/status of the light. And if it's not detecting the light, but it is marked in the map, it will approach the intersection slowly until it sees a green? (All of this assuming you have stopping at lights enabled)
Yes. If the light/sign state cannot be read for whatever reason (e.g. it is actually missing), the car will stop at the intersection (and it gradually slows down as it approaches, until the signal state is confirmed).
On the other hand, if it is not marked in the maps, the car will not stop, no matter whether a signal/sign is present.
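A minimal sketch of that approach-and-stop logic as described; the function, inputs, and return strings are invented for illustration, not actual Autopilot code:

```python
from typing import Optional

def approach_behavior(is_mapped: bool, observed_state: Optional[str]) -> str:
    """Speed behavior when approaching an intersection, per the answer above."""
    if not is_mapped:
        # Not marked in the maps: the car does not react, regardless of what
        # is physically at the intersection.
        return "maintain speed"
    if observed_state is None:
        # Mapped but the signal cannot (yet) be read: keep slowing down and
        # come to a stop at the intersection if it is never confirmed.
        return "slow down; stop if state stays unconfirmed"
    if observed_state == "green":
        return "proceed"
    return "stop at the line"    # red light or stop sign
```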

The v2 and v3 maps are a lot more detailed, but they mostly appear to be populated in California; I never got much data from these in TN.

The v2 map structure was very clearly visible in the code; it's more obscured in v3.
 
So then, from your experience, what is the lead time from some feature showing up in something you uncover and it being in a public release, beta or GA? Also, I'm not sure how to interpret your reply to my earlier post. In my 19.8.3 release notes it references stop light detection. You made reference to something needing to be turned on. I assumed you meant in the code you have running where options are selectable. I double-checked, and I have no options related to stop lights. So does that mean the lights I've tested against are not encoded? Thanks.
 
The v2 and v3 maps are a lot more detailed, but they mostly appear to be populated in California; I never got much data from these in TN.

The v2 map structure was very clearly visible in the code; it's more obscured in v3.

Ohh.. So the maps are not strictly contained within CA borders? That makes more sense.

So perhaps these maps are created from data from consumer fleet vehicles? And once there is enough data from several cars, they can turn it into the map and populate the v2/v3 maps?
 
So then, from your experience, what is the lead time from some feature showing up in something you uncover and it being in a public release, beta or GA
It really depends.
But for NoA ULC, Elon just tweeted it's going wide, so from 18.39 (mid-September?) to now, basically.

In my 19.8.3 release notes it references stop light detection. You made reference to something needing to be turned on
It's a config variable Tesla turns on, and it is reflected in your car within 24 hours of that moment. But for this particular feature there's no user-facing way to see whether it's on or not, other than once it triggers.
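Roughly how such a server-pushed gate behaves (illustrative only; the flag name and config cache are invented here, only the ~24-hour sync and the lack of a user-facing toggle come from the post above):

```python
from typing import Dict

def stop_light_feature_active(config: Dict[str, bool]) -> bool:
    """Off by default; it becomes active only once the server-pushed flag
    lands in the car's local config (within ~24 hours of Tesla setting it).
    There is no user-facing toggle, so the only visible evidence is the
    gray light/sign icon appearing near mapped intersections."""
    return config.get("stop_light_detection_enabled", False)

car_config: Dict[str, bool] = {}                        # flag not pushed yet
assert stop_light_feature_active(car_config) is False
car_config["stop_light_detection_enabled"] = True       # Tesla flips it remotely
assert stop_light_feature_active(car_config) is True
```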

Is this proof of why things like NoA seem to work better in Cali?
Yes. It's a well-known thing: Tesla's AP works best in CA.

So perhaps these maps are created from data from consumer fleet vehicles? And once there is enough data from several cars, they can turn it into the map and populate the v2/v3 maps?
I don't really have visibility into how they create this data. It's pretty clear they collect fleet data from regular cars for the v1 ADAS maps.
 
Anecdotally sure, but never any firm evidence as to why.

It's not like there aren't thousands of Teslas near me driving over the same roads that NoA does terribly on. We're talking about the most major freeways in the state that it's getting wrong.

Hopefully we'll learn more about it on the 19th.
Have you heard the saying "everyone scratches their own itch"? Well, there are lots of people at Tesla who live in CA and scratch their own itches there....
 
I have a more basic question. Does HW2.5 learn in real-time? For any given route, does a car that's driven it a couple of times navigate it better than a car never having traversed it? I think yes but am unsure if that's confirmation bias.
 
Unequivocally no. There is no active learning. See @verygreen's many postings on this.
My attempt there was to solicit an answer from da man himself (I still don't understand the desire for total anonymity on here and hopefully @verygreen is a he). But to answer you:
1) I tried going back to conversations I've had on the garage wiring for a potential Wall Charger. They are gone. I do not know what the purge policy is, but my post history is very limited on here. Actually, so is verygreen's, as I tried exactly what you suggested.
2) There have, even in the past day or two, been completely erroneous if not environment-specific posts presented as universally applicable.
I am sure that over the history of @verygreen's comments on NNs they have spanned HW1, HW2, HW2.5 and HW3. My question was very specific to HW2.5.
And certainly I don't mean to imply you are wrong; you may well be right. But, again, with the NEMA 14-50 on a 30A circuit you can see clearly intelligent people making definitive statements that were immediately disputed by other intelligent people. So don't take this as a slight, it is not.
 
@verygreen

So NoA must only be using the ADAS tiles. For knowledge of all the lanes, getting in the right lane, taking interchanges and all that, is it using lane data from the ADAS tiles?

Is the ADAS tiles' lane model only used for high-level route planning, or is it also used to improve lane centering? For example, when camera detection fails momentarily, does it use the lane data to stay on course? And on tight interchanges, does it use the lane data from the ADAS tiles to help steer around the curve accurately?

Then I am wondering: are they only using GPS/IMU to localize themselves to the lane model in the ADAS tiles? Or do they also use lane detection to refine their lateral position with respect to the map? And I assume they are not yet using those vertical line landmarks to improve localization?

When in CA, say where there are v2/v3 maps, do you know if NoA is using these maps to enhance lateral/longitudinal control of AP?

Thanks
 
I have a more basic question. Does HW2.5 learn in real-time? For any given route, does a car that's driven it a couple of times navigate it better than a car never having traversed it? I think yes but am unsure if that's confirmation bias.
No, there's nothing like that at all. Until a route is updated in the maps, all cars drive it the same; after the update, all cars still drive it the same, just differently than before. The updates are far from real-time, too.
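One way to picture it: what the car does on a route is a pure function of the shipped software and map version, not of any per-car driving history (purely illustrative sketch; all names are made up):

```python
def route_plan(firmware: str, map_version: str, route: str) -> str:
    """Two cars on the same firmware and map version drive a route identically,
    regardless of how many times either has driven it before. Behavior changes
    only when the software or maps are updated, and then for every car at once."""
    return f"plan for {route} @ firmware {firmware}, maps {map_version}"

# A car that has driven the route 100 times and one that never has get the same plan:
assert route_plan("2019.8.3", "adas-v1", "home_to_office") == \
       route_plan("2019.8.3", "adas-v1", "home_to_office")
```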

(I still don't understand the desire for total anonymity on here and hopefully @verygreen is a he)
Judging by my video stats from YouTube, it is fair to assume you are talking to all males on these forums ;)