
HW2.5 capabilities

I guess I'll post here too.
The latest snapshotting logic shows that they actually have some quite advanced labels.
A snapshot trigger requesting "label: construction" fired on my car today and created this image:

[image: aQcVLmM.jpg]


There's a bunch of other labels requested like "Confusing lanes", "slope up/down", "barrier" and so on.

I guess I have an urgent need to drive around in strange places to see what else I can trigger ;)
Please drive on a BMX course to see if "sick jump ahead" is tagged.
 
OK, so now that I know the snapshots on my car stopped because they were time-limited to just 23 hours, I rolled the clock back and drove around some more.

I only managed to trigger a few slope up/down snapshots. The confusing part is that an uphill stretch often has a downhill stretch in the same frame, so I'm not sure how the NN decides whether it's facing a slope up or a slope down (because it's both).

Slope down:
[images: 3144427277024.img.data.jpg, 3213424383584.img.data.jpg]

Slope up:
[images: 240550390400.img.data.jpg, 2544536570176.img.data.jpg, 2808941783936.img.data.jpg, 2943685956320.img.data.jpg, 3146927338272.img.data.jpg]
 
So I'll keep trying to trigger the other snapshots. I also found it interesting that the "random" snapshots fired very frequently. I wonder if that's because they were hoping to catch false negatives with them? I.e. if a random snapshot shows something (detected by some other means?) but there's no immediately preceding label snapshot, you know you have a false negative. No, that can't be it: they could feed every random image into the same NN that runs on the car and find those that way. Weird.
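For what it's worth, the mining loop being speculated about above could look something like this. This is only a sketch of the general idea, not anything from the firmware: `Snapshot`, `offline_label`, and the labels themselves are made-up stand-ins, and the "offline" pass stands in for whatever stronger signal (a bigger server-side model, or human review) would disagree with the on-car network.

```python
# Hypothetical sketch of server-side false-negative mining on uploaded
# "random" snapshots. All names here are illustrative, not Tesla's.

from dataclasses import dataclass

@dataclass
class Snapshot:
    image_id: str
    car_labels: set  # labels the on-car NN emitted when the image was captured

def offline_label(snap):
    """Stand-in for a stronger offline pass (bigger model or human review).

    A real implementation would look at the pixels; this toy version just
    pretends images with 'cone' in the id contain a construction zone.
    """
    return {"construction"} if "cone" in snap.image_id else set()

def find_false_negatives(snapshots):
    """Images where the offline pass finds a label the car missed."""
    misses = []
    for snap in snapshots:
        missed = offline_label(snap) - snap.car_labels
        if missed:
            misses.append((snap.image_id, missed))
    return misses
```

Images the car already labeled correctly drop out of the result, so only the disagreements would need a human look.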
 
Not being pedantic, but do we actually know that 3 is Model 3? It's an int field, so previous options would likely have to be 0, 1, and 2.... (not S and X)
We don't really know that until we get inside a model 3, I guess.

I looked further and there are several options.
If it's taken from VehicleGeometryType then 3 is VEHICLE_GEOMETRY_MODEL_X_LHD (matches my car and makes sense I guess).
But if it's taken from ChassisType then 3 is CHASSIS_MODEL_3.

The trigger also is only supposed to hit within 2000 m of the last park; the first FCW was certainly not that far away, but the second I'm not so sure about. I guess I need to dig deeper into how that vehicle_model field is populated.
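The distance gate itself is trivial to sketch. Only the 2000 m figure comes from the trigger config discussed above; the function names, signature, and the use of a haversine great-circle distance are my own assumptions about how such a check could work:

```python
# Sketch of a "within 2000 m of last park" trigger gate (illustrative only).
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in metres."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def trigger_allowed(cur_pos, last_park_pos, max_dist_m=2000.0):
    """True if the car is still within max_dist_m of where it last parked."""
    return haversine_m(*cur_pos, *last_park_pos) <= max_dist_m
```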

Edit: Ok, that was easy:
Code:
struct MiscTriggerFields {
...
    ChassisType vehicle_model;
...
}

In a way I guess that makes sense: if they were doing model prefiltering server-side, why would they need to check it again on the car?
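To make the ambiguity concrete: only the two value-3 names below are taken from the firmware strings quoted in this thread; every other member is a placeholder I invented to show why a bare int field decodes differently depending on which enum you assume.

```python
# Sketch of the two candidate enums for the vehicle_model int field.
# Only the value-3 entries come from the firmware; the rest are placeholders.
from enum import IntEnum

class VehicleGeometryType(IntEnum):
    VEHICLE_GEOMETRY_UNKNOWN = 0      # placeholder
    VEHICLE_GEOMETRY_MODEL_S_LHD = 1  # placeholder
    VEHICLE_GEOMETRY_MODEL_S_RHD = 2  # placeholder
    VEHICLE_GEOMETRY_MODEL_X_LHD = 3  # from firmware strings

class ChassisType(IntEnum):
    CHASSIS_UNKNOWN = 0  # placeholder
    CHASSIS_MODEL_S = 1  # placeholder
    CHASSIS_MODEL_X = 2  # placeholder
    CHASSIS_MODEL_3 = 3  # from firmware strings

# The same raw value 3 names a Model X geometry under one enum
# and a Model 3 chassis under the other.
```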
 
The random snapshots do seem to be a search for false negatives. Have a human find the false negatives from the photos. I'm not sure how else they could find the edge cases. Although the advantage Tesla has is the reaction of the human driver to false negatives and false positives in the driving path.

This is interesting, as Musk claimed that they would not have to do the manual work that Mobileye has done to get an accurate system.

I wonder how close Tesla will get to a fully functional system before they add Lidar.
 
So Tesla is still doing manual labelling?

Well, something needs to label the unlabeled photos that should have a label. I'm sure Tesla uses, or tried to use, the human driver to do some of that work. So that would not be "manual". But I suspect paying a bounty to humans for searching photos to find edge cases is attractive to the developers.

I can see why everyone quits. Trying to rush this stuff to meet Musk's absurd claims must be extremely unfun.
 
Well this was disturbing: Perhaps the first rainstorm I’ve driven in since purchasing my Tesla (P75D, AP2.0, 2017.42). Lane keeping on surface streets was fine, even with really unclear (to my eye) lane lines. Didn’t bother the car a bit. But I noticed the console’s representation of the car in front of me kept disappearing momentarily on a fairly regular basis.

Then I realized the target would disappear right after a windshield wiper pass over the camera. The reaction was slightly delayed, perhaps 0.1 or 0.2 seconds after the sweep. And it was quite consistent, on the order of 80% of the sweeps.

The vehicle target was a mid-sized work truck, painted blue, pulling an orange, wheeled liquid tank of some sort.

It didn’t seem to affect the car’s ability to maintain a safe following distance (set at 3). But it does seem that at least in this release, the target solution is affected momentarily by the windshield wiper.
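One plausible reading of the blinking icon is that the display mirrors per-frame detections, so a 0.1–0.2 s wiper occlusion briefly drops the target. A tracker that coasts for a few frames would bridge the gap. This is purely an illustrative sketch of that idea, not Tesla's tracker; the class, names, and frame budget are all assumptions:

```python
# Minimal sketch of a detection track that coasts through brief dropouts
# (e.g. a wiper blade crossing the camera). Illustrative only.

class Track:
    def __init__(self, coast_frames=5):
        self.coast_frames = coast_frames  # e.g. 5 frames ~ 0.15 s at ~36 fps
        self.misses = 0
        self.active = False

    def update(self, detected: bool) -> bool:
        """Feed one frame's detection result; return whether to show the target."""
        if detected:
            self.misses = 0
            self.active = True
        elif self.active:
            self.misses += 1
            if self.misses > self.coast_frames:
                self.active = False  # give up only after a sustained dropout
        return self.active
```

With a coast budget of a few frames, a single wiper sweep never blanks the icon, while a target that genuinely leaves the frame still drops out after the budget is spent.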
 

This was exactly my experience as well.

Possible scientific vindication of using just cameras, no lidar
 
Then I realized the target would disappear right after a windshield wiper pass over the camera. The reaction was slightly delayed, perhaps .1 or .2 secs after the sweep. And it was quite consistent, on the order of 80% of the sweeps.

Interesting.

So obviously they need to implement frame synchronization with the wipers, much like early warplanes firing the forward machine gun through the prop.

Or just stop the image of the lead vehicle on the dash glitching so as not to disconcert the human at the wheel :)

Kinda fun to consider that Tesla vision needs to be able to handle blinking.
 