Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Seeing the world in autopilot, part deux

How did Mobileye spot this pedestrian before it could see them? Suggests to me that there is enough of a time lag between the raw video feed and the classification that they have had to frig the boxing overlay to make it appear like detection is happening sooner than it really is.

Very interesting. It seems to me a proof that Mobileye is cheating, at least in that video. I don't see any other explanation. Anyone have another view about this?
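If that's what's happening, here's a toy sketch of how a demo overlay could hide classification lag: pair each detection with the frame it was computed from, instead of the frame current when inference finished. The latency number and function name are made up for illustration, not anything from Mobileye's actual pipeline.

```python
# Hypothetical sketch: if classification takes LATENCY_S, a demo video can
# be made to look "instant" by shifting each detection's timestamp back so
# the box lands on the source frame rather than the current one.
LATENCY_S = 0.15  # assumed inference latency, purely illustrative

def align_overlay(detections, latency_s=LATENCY_S):
    """Shift (timestamp, box) pairs back by the pipeline latency so the
    overlay appears to detect objects earlier than it really does."""
    return [(t - latency_s, box) for t, box in detections]

dets = [(1.00, "pedestrian"), (1.25, "pedestrian")]
print(align_overlay(dets))
```

In a live system you can't do this, which would explain why detections look later in raw captures than in polished demo videos.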
 
As usual, thank you @verygreen !

It sucks that Tesla feels they need to waste time trying to “protect” things from people like you and your circle of tinkerers; I would rather they just spend their time doing something more productive. Hope it doesn’t create an annoying situation for you and others, but, as always, we all appreciate what you share!

Mild pushback:
Why are people complaining that lane change is not fully automatic and requires driver confirmation? Is it due to false expectations generated by people who made the software do something it was not yet intended to do?
Instead of going from all-driver-commanded lane changes to AP-suggested / driver-confirmed, people now feel like they are going from fully automatic to driver-confirmed.

What should be a feature increase is turned into a feature decrease.

Expectation management.
 
@Bladerskb @Joel @J1mbo I felt bad about helping to derail verygreen’s thread but I feel a bit better now that I’m not the only one. I created a megathread where we can move Mobileye vs. Tesla discussions when they crop up in other, unrelated threads:

Mobileye vs. Tesla megathread

Sometimes unrelated threads do create a genuine jumping off point for the Mobileye vs. Tesla discussion, but that discussion should be continued elsewhere to prevent people like poor verygreen from losing their threads to off-topic debates.
 
It seems it recognized the ped(estrian) when they were close, even with a cart at a 3/4 angle. Also, isn't green the drivable free space? It seemed the ped/cart was excluded from that area fairly early.
Well, the problem is the actual pedestrian recognition was kind of late. Imagine the pedestrian was about to cross the street: would you be able to stop in time at the distance at which it was detected as such?
 
Well, the problem is the actual pedestrian recognition was kind of late. Imagine the pedestrian was about to cross the street: would you be able to stop in time at the distance at which it was detected as such?

If the non-drivable area tracked the person with the cart, the car would avoid them or stop. However, it would not obey the preemptive "yield to a person waiting to cross at a crosswalk" law we have locally. The orientation of a person crossing the street with a cart would also be different than in this situation (rotated 90 degrees).
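To put a rough number on the "could you stop in time" question above, here's a back-of-the-envelope stopping-distance calculation. The reaction time and deceleration values are generic illustrative assumptions, not Tesla or AEB specs.

```python
# Rough stopping-distance check (illustrative numbers): reaction distance
# plus braking distance v^2 / (2a). Can the car stop before reaching a
# pedestrian first classified at some distance ahead?
def stopping_distance_m(speed_kph, reaction_s=1.0, decel_ms2=7.0):
    """Total distance travelled: v * t_react + v^2 / (2 * a)."""
    v = speed_kph / 3.6  # convert km/h to m/s
    return v * reaction_s + v * v / (2 * decel_ms2)

# e.g. at 50 km/h with a 1 s reaction and 7 m/s^2 braking:
print(round(stopping_distance_m(50), 1))  # roughly 28 m
```

So if the pedestrian is only classified 15 to 20 m out at city speeds, there's not much margin, which is exactly the concern raised above.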
 
@verygreen It seems to be an ongoing controversy whether without lidar HW2 Teslas are capable of estimating depth accurately. Would you or anyone you know be able to run a proper trial of Autopilot’s depth estimation metadata against some ground truth (like a long measuring tape or a laser distance measurer)? Ideally including as one of the tests from a distance as far away as you can accurately measure with your ground truth instrument.

I think if Autopilot always comes in within 20 cm (about 8 inches) of ground truth, that’s hands-down a win for Tesla. A bit worse would probably still be fine, since humans only seem to manage about 20 cm of accuracy longitudinally when parking.

Humans do better on lateral accuracy: about 10 cm or 4 inches. Does the Autopilot metadata provide a distance estimate for cars directly adjacent? If so, that would allow testing of Autopilot’s lateral accuracy as well.
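A minimal sketch of how that trial's results could be scored, assuming you've collected pairs of tape-measured distances and Autopilot's reported estimates. All the readings below are made-up placeholder data, not real measurements.

```python
# Compare Autopilot's reported distances against tape-measured ground
# truth and check whether every error is within the 20 cm threshold.
trials = [  # (ground_truth_m, autopilot_estimate_m) -- hypothetical data
    (5.00, 5.10),
    (10.00, 9.85),
    (25.00, 25.30),
]

def within_tolerance(pairs, tol_m=0.20):
    """Return (worst absolute error, whether all errors are <= tol_m)."""
    errors = [abs(est - truth) for truth, est in pairs]
    return max(errors), all(e <= tol_m for e in errors)

worst, ok = within_tolerance(trials)
print(f"worst error {worst:.2f} m, all within 20 cm: {ok}")
```

Repeating this at several ranges, out to the farthest distance the tape or laser can measure reliably, would show whether accuracy degrades with distance.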

Edit: @jnuyens maybe you could conduct this trial?
 
I thought maybe you had access to the same metadata as verygreen. Their videos show distance estimates down to 0.1 m / 10 cm.

If you read the article, you'd know it's a special dev board he discovered.

"They have now teamed up again for an even better look at what Autopilot can see after they bought an Autopilot Hardware 2.5 computer on Ebay that just happened to be a fully unlocked developer version."

Watch a Tesla drive in Paris through the eyes of Autopilot
 
How do you have v9? Are you in the early access program? I don’t really understand how people are getting v9, who has developer permissions, or whatnot. It is all very mysterious to me, even after reading the article.

You're confusing the two. The article is based on a developer APE board @verygreen found. V9 has absolutely nothing to do with that. Hope that clears it up for you.

Anyone who has v9 and is talking about it didn't get it from Tesla directly; otherwise they'd be under NDA.
 
I'm interested in this super special blacklist. Seems to be a violation of US law. Shady Tesla, violating owners' rights to root their cars through copyright exemptions.

But it's like Fight Club, @croman: Tesla will never admit that it exists. Even when you go to the SC, they'll just say your car hasn't been slated for a firmware compatible with your car and its packages yet. Even though we know all firmwares are 100% generic.
 
I ran across this thread just now. Very interesting stuff, for sure. I have not read through all 15 pages of comments (!!), but wanted to throw this out there...

In the OP I read "a pedestrian in red(dish?) jacket is not detected at all" (with the recommendation to maybe NOT wear red jackets in Cali and Norway...! hahahaha). I watched those few seconds over and over and noted the car was turning (it looked like maybe it was even a U-turn). As I observed this short section repeatedly, I saw that there is ANOTHER pedestrian, this one in black/gray, at about 3:34, just seconds after the redcoat, walking in the crosswalk and (it appears) even closer to the car. Most importantly, it appears that ped ALSO was not recognized (I saw no bounding box around either the redcoat or this second person). Because this ped was in the crosswalk and dressed in a dark color, I think this needs to be pointed out, as I find it even more distressing. Perhaps as a point of discussion or thought.

Just thought I'd mention it.

- shud
 
So I have another old video for you. This one is interesting in that Autopilot is identifying and tracking roadside structures like poles. There are also other unknown points, mostly in the upper part of the picture; these are not radar returns but purely vision-generated points. Note how they often track cloud outlines in the sky, not just traffic signs and whatnot. I guess this is mostly to be used with HD maps for better localization.

I think this shows that @jimmy_d's analysis of the old NNs was correct and there's a lot more hidden, not-yet-activated stuff in there.


There are two kinds of these things: an anonymous point cloud, and points with an object type. You can tell them apart because the latter have a number nearby denoting the type. We do not have a coherent picture of what each value represents.

Also, for whatever reason, v9 and even later v8.1 builds suddenly turned this detection off, so it's entirely possible the code was experimental and did not live up to expectations.
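For anyone parsing dumps like this, a hedged sketch of the two point flavours described above: anonymous point-cloud entries versus points carrying an object-type number. The field names and type value here are assumptions for illustration, not the actual metadata format.

```python
# Split extracted points into typed vs anonymous, based on whether an
# object-type number accompanies them. Field names are guesses.
points = [
    {"x": 12.4, "y": -1.1},             # anonymous point-cloud entry
    {"x": 30.2, "y": 2.5, "type": 14},  # typed point (meaning unknown)
]

typed = [p for p in points if "type" in p]
anonymous = [p for p in points if "type" not in p]
print(len(typed), len(anonymous))
```

Collecting the distinct type numbers over many frames and correlating them with what's visible on screen would be one way to start decoding what each value means.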
 