Citing Elon's tweets as a source of facts is rather amusing...
Whereas listening to some third party on a forum with no link to the actual development is somehow a better choice?
The fact that Elon is talking about the rear-facing camera in the 3 being for autonomous ride sharing tells me that this whole thing is about juicing sales and the stock price. I mean, seriously, who believes that these things will ever be robo-taxis? And by "these things" I specifically mean AP2 cars produced since Dec 2016, not some future product.
Hi! Right here. With the clarification that only the Model 3 (the car released after the Tesla Network announcement) currently has the hardware needed for public autonomous cab service: HW3, increased redundancy, and an interior camera.
Monocular depth estimation where you can see the entire object and see it in context is one thing. Depth estimation of a vehicle in the side cameras in a nearby lane -- where you may not see the whole vehicle and can see very little context -- is very different.
Well then, it's a good thing the side cameras can see the whole object (or at least enough pavement to determine how far away it is). Note that even a lane splitting motorcycle is fully in view...
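The "enough pavement" point is just the flat-ground assumption from single-camera geometry: if you know the camera's height and focal length, the pixel row where a tire meets the road maps directly to distance. Here's a minimal sketch of that idea; the focal length, horizon row, and camera height below are hypothetical placeholder numbers, not Tesla's actual calibration.

```python
def ground_plane_distance(v_contact, v_horizon=480, f_pixels=1200.0, cam_height_m=1.1):
    """Distance (m) to the point where an object touches the road,
    under a flat-ground, level-camera pinhole model.

    v_contact:    pixel row (from image top) where the object meets the pavement
    v_horizon:    pixel row of the horizon line
    f_pixels:     focal length expressed in pixels (assumed value)
    cam_height_m: camera height above the road surface (assumed value)
    """
    dv = v_contact - v_horizon
    if dv <= 0:
        raise ValueError("contact point must be below the horizon")
    return f_pixels * cam_height_m / dv

# A contact point 60 px below the horizon: 1200 * 1.1 / 60 = 22 m away
print(round(ground_plane_distance(540), 1))  # 22.0
```

So even with only part of a vehicle in frame, the visible contact point with the pavement is enough for a distance estimate, which is why a lane-splitting motorcycle being fully in view is more than sufficient.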
From the thread "Here's what Tesla Dashcam's side cameras are able to see and record on video"
I know the current software is incapable of this: literally every day on my commute, TACC brakes for vehicles in the neighboring lane that give no indication of coming into my lane and, in fact, are not even particularly close to the lane line.
Current software not doing something does not mean it is not achievable with the current sensing hardware...
I would guess Sentry Mode is an offshoot of the neighboring-vehicle/pedestrian detection problem and is being used to refine that detection...