There's been some discussion about the inward-facing camera that Tesla introduced on cars starting with AP2.5. We have an AP2 Model X and a non-AP2.5 Model 3 (though it obviously has the hardware).
We know that Tesla has hired a number of augmented reality specialists from other companies, particularly Skully, the HUD helmet startup that failed. I'm wondering if the camera inside the car is also going to be used to track head position so that AR projected on a HUD can be registered accurately to the environment. Because your perspective changes as you move your head, accurately overlaying AR on the windshield so it lines up with the outside world seems problematic: your view is very different when you're leaning against the window sill than when you're leaning on the arm rest, and people's heads naturally sit at different heights anyway.
The only way to even hope to align an AR HUD accurately would be to understand where the viewer's head is positioned.
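To make the parallax problem concrete, here's a minimal sketch (entirely my own illustration, not anything Tesla has described): treat the windshield as a flat vertical plane and intersect the ray from the eye to a world point with that plane. The function name, coordinates, and plane position are all hypothetical; a real windshield is curved and would need per-car calibration.

```python
# Hypothetical geometry sketch: where a world point should be drawn on a
# flat "windshield" plane depends on where the viewer's eye is.
# Coordinates: x points forward out of the car, y left/right, z up (meters).

def hud_point(eye, target, windshield_x=0.5):
    """Intersect the eye->target ray with the plane x = windshield_x.

    eye, target: (x, y, z) tuples. Returns the (y, z) position on the
    windshield plane where the overlay must be drawn to line up.
    """
    ex, ey, ez = eye
    tx, ty, tz = target
    t = (windshield_x - ex) / (tx - ex)  # fraction of the way along the ray
    return (ey + t * (ty - ey), ez + t * (tz - ez))

# The same world point (a car 20 m ahead, slightly to the left) lands at
# noticeably different windshield positions as the driver's head moves:
world_point = (20.0, -1.0, 0.2)
centered = hud_point((0.0, 0.0, 1.2), world_point)   # head centered
leaning = hud_point((0.0, 0.15, 1.1), world_point)   # leaning toward armrest
print(centered, leaning)
```

Without head tracking, the system has to pick one fixed eye position, and the overlay drifts off its real-world target the moment the driver moves. That's the core reason an interior camera would be useful for an AR HUD.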