Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Virtual overhead view and FSD

Just my $0.02 worth: I was thinking about why Elon was talking about tying the long-awaited virtual overhead view (I've wanted that from day one, after I saw it elsewhere) to FSD. In the S, the doors are very high and it's hard to see the parking spot lines.

As many have noted, the vehicles (all of them) lack sufficient camera coverage to provide a real-time snapshot of an overhead view. However, the new rollout of FSD involves the "4th dimension," time - so what if you add time to the problem?

As you are driving into a parking space, the FSD stuff is capturing data over time - and some of that could be scanning areas which are later in the camera "blind spots" - but weren't a few seconds ago. If the FSD computer remembers enough, it can synthesize the view.
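The "add time" idea above can be sketched as a toy occupancy/texture grid: every ground patch the cameras see gets stamped into a world-fixed map, so a spot that later slides into a blind spot can still be drawn from when it was visible. Everything here (cell size, the class and method names, the sensor model) is made up for illustration - this is not how Tesla actually does it:

```python
import numpy as np

CELL = 0.05  # metres per grid cell (assumed)

class OverheadMap:
    """Toy world-fixed map that remembers the last-seen value per cell."""

    def __init__(self, size=400):
        self.size = size
        self.grid = np.full((size, size), np.nan)  # NaN = never seen

    def _to_cell(self, x, y):
        # World metres -> grid indices, origin at the grid centre.
        return int(self.size / 2 + x / CELL), int(self.size / 2 + y / CELL)

    def observe(self, points):
        # points: iterable of (x, y, value) ground samples currently
        # visible to the cameras, already in world coordinates.
        for x, y, value in points:
            i, j = self._to_cell(x, y)
            if 0 <= i < self.size and 0 <= j < self.size:
                self.grid[i, j] = value

    def render(self, x, y):
        # Read back a cell even if it is in a blind spot *now*:
        # the value persists from whenever it was last seen.
        i, j = self._to_cell(x, y)
        return self.grid[i, j]

# The car sees a parking-line sample at (2.0, 0.0) while approaching...
m = OverheadMap()
m.observe([(2.0, 0.0, 1.0)])
# ...then drives forward until that spot is under the bumper. The map
# still remembers it - which is the "add time to the problem" idea.
print(m.render(2.0, 0.0))  # 1.0
```

A real system would of course need accurate odometry to keep the grid registered as the car moves, which is exactly what the FSD rewrite's temporal tracking would provide.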

I bet this also nicely gets around any existing patents on the subject, as Nissan/BMW/etc. don't let others use their patents for free.

Thoughts?
 
My only concern would be an animal or child entering the frontal area after the car has "seen" it. I suppose the ultrasonics would catch the discrepancy.... I hope!
The rewrite is called 4D and the 4th dimension is time.
It does not "lose" what happened in the frame before, or 5 frames before; it takes all of that into account.

i.e. there is no "once it has seen" problem because it always keeps seeing.
 
But it cannot see the area close to the front of the front bumper, because the camera is mounted high in the windshield. So if you are driving forward slowly, as in parking, it would be possible for something to move into that area from the side without the camera catching it. As I said, hopefully the ultrasonics catch it.
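For a rough feel for how big that blind zone is, simple trigonometry works: the ground is invisible closer than the camera height divided by the tangent of the steepest downward view angle. The numbers below (1.3 m camera height, 20 degrees of downward view) are guesses for illustration, not actual Tesla geometry:

```python
import math

def blind_distance(camera_height_m, max_down_angle_deg):
    """Horizontal distance from the camera at which the ground first
    becomes visible, given the steepest downward viewing angle.
    Both inputs are assumptions, not measured Tesla values."""
    return camera_height_m / math.tan(math.radians(max_down_angle_deg))

# e.g. a camera ~1.3 m up whose view the hood cuts off at 20 degrees
# down cannot see the ground closer than about 3.6 m away:
print(round(blind_distance(1.3, 20.0), 1))  # 3.6
```

That several-metre gap in front of the bumper is why something entering it from the side while creeping forward would only show up to the ultrasonics.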

I prefer backing in (what a great rear camera!), but I'm really looking forward to the overhead view. Pun intended!
 
They've had the ultrasonics drawing the line on Teslas since ~2014 as you approach obstacles in close proximity, and the measurements are fairly accurate down to an inch.
 
They're in the top of the windshield, and cannot see directly in front of the car for several feet. And the ultrasonic sensors cannot "see" low objects, like curbs. I'm looking forward to the overhead view, but I don't expect it to be as good as cars that have dedicated cameras in the grille or front bumper.
 
I've previously owned 3 cars that were equipped with 360 view. All of them had cameras mounted under the side mirrors, pointed straight down, to get the surround view. Maybe Tesla has something magical up its sleeve. Sure hope so, because I miss it.
 
Possible they might display something like this and create a 3D vector space representation of the world (start at 1:40 – keep in mind this was created using only one camera and one radar):

Considering Teslas have 8 cameras plus ultrasonics and radar, it seems likely this is something they're working on. It would be really impressive if this is the birdseye view Elon says we'll be getting with the FSD update (V11, I'm assuming).
 
If they can map labeled objects from the cameras and radar into 3D space then the Birdseye view (or any view for that matter) is possible by just moving the virtual camera around in 3D space. Of course it’s limited by what the cameras can see.
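The "moving the virtual camera" idea can be sketched in a few lines: once labeled objects live in car-centred 3D coordinates, a birdseye view is just a projection that throws away the height axis. The scale, origin, and object list below are invented for illustration:

```python
def birdseye(objects, pixels_per_metre=10, origin=(50, 50)):
    """objects: list of (label, x, y, z) in car-centred metres,
    x forward, y right, z up. Returns {label: (row, col)} pixel
    positions in a top-down image. All parameters are assumptions."""
    view = {}
    for label, x, y, z in objects:  # z (height) is simply discarded
        row = int(origin[0] - x * pixels_per_metre)  # forward = up
        col = int(origin[1] + y * pixels_per_metre)  # right = right
        view[label] = (row, col)
    return view

# A made-up scene: a car 3 m ahead and slightly left, a curb 1 m
# ahead and 2 m to the right.
scene = [("car", 3.0, -1.0, 0.5), ("curb", 1.0, 2.0, 0.1)]
print(birdseye(scene))  # {'car': (20, 40), 'curb': (40, 70)}
```

Any other virtual viewpoint (chase cam, side view) would just swap this orthographic drop-the-z projection for a different camera matrix over the same 3D vector space.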

I haven’t seen any evidence of object permanence in the NN. If an object disappears behind a car or a tree it disappears altogether.
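The kind of object permanence that seems to be missing can be sketched as simple track aging: instead of dropping an object the instant detection fails, keep its track alive for a few frames. MAX_AGE and all the names here are illustrative, not anything from Tesla's actual NN:

```python
MAX_AGE = 5  # frames an unseen track survives (assumed)

class Tracker:
    """Toy track manager: detected objects persist briefly through occlusion."""

    def __init__(self):
        self.tracks = {}  # object id -> frames since last detection

    def update(self, detected_ids):
        for oid in detected_ids:
            self.tracks[oid] = 0          # seen this frame: reset age
        for oid in list(self.tracks):
            if oid not in detected_ids:
                self.tracks[oid] += 1     # occluded: age the track
                if self.tracks[oid] > MAX_AGE:
                    del self.tracks[oid]  # give up after MAX_AGE frames

    def alive(self):
        return set(self.tracks)

t = Tracker()
t.update({"pedestrian"})           # detected once
for _ in range(3):
    t.update(set())                # occluded behind a tree for 3 frames
print("pedestrian" in t.alive())   # True: still remembered
for _ in range(5):
    t.update(set())                # occluded too long
print("pedestrian" in t.alive())   # False: track expired
```

A real tracker would also predict the occluded object's motion rather than freeze it in place, but even this bare-bones version keeps a pedestrian "existing" while a tree briefly blocks the view.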
 
They've had the ultrasonics drawing the line on Teslas since ~2014 as you approach obstacles in close proximity, and the measurements are fairly accurate down to an inch.

Except for when you are within 10 cm of an object and the MCU shows STOP. In Europe that means you still need to keep creeping forward to park :). Seriously, there should have been a camera in the front bumper. Sonar is useless for parking in really tight spaces.

A birdseye view isn't very useful if it can't add detail that is lacking from the sensors to begin with. It's fine for 10 cm accuracy, but not for cm accuracy.