
Elon confirmed addressing my biggest Tesla software deficiency: Tight Parking/360 view

Its effective resolution is not 10 inches. That’s just the closest an object can be for it to consistently register. It absolutely can distinguish between, for example, 11 and 12 inches.
Sorry, I wasn't being very careful with my words... below 10" you just get told "Stop", so you can't tell the difference between 6", 7", 8", etc.

Of course you'd be sensible to stop at 12", but if it can reliably indicate 11" vs 12", it would be nice if it could keep going down to, say, 7".

When I typed "effective resolution 10"", I meant it from the perspective that I don't care about 20", 15" or 12", other than as a rough guide that my eyes can often judge equally well.

It's up close that it makes the most difference.
 
Presumably the ultrasonics will pick up said suicidal, unattended dog. Sure, you won’t get a picture of the blur as the dog runs past a camera, but some other warning could suffice in that case.
Actually, ultrasonics will not even pick up a large duffel bag in front of the car, so probably not the dog either. The issue here, by the way, is less that the ultrasonics won't pick it up and more that the camera would show the driver an all-clear, which is worse. It's as if the car had a feature to tell you whether the traffic light is red or green, but whenever it couldn't figure it out it told you green rather than that it didn't know.
 
I recently saw a video for a car (don't recall which one) that not only had the traditional overhead 360 degree camera view but also an alternate view where the car appeared on screen as if you were standing outside it, watching from a distance.

I'm not sure how that one was even possible.
 
The 360 degree "surround" image is synthesized by splicing multiple camera images together (with appropriate cropping so the resulting composite shows more-or-less full coverage). It's not magic, it's PFM software with multiple overlapping sensors.
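To make the "splicing" concrete, here's a minimal Python/OpenCV sketch of the idea, not Tesla's actual pipeline: each camera frame is warped onto a shared ground plane with a pre-calibrated homography, and the warped views are pasted onto one top-down canvas. The homography values, the canvas size, and the crude overwrite-instead-of-blend compositing are all assumptions for illustration.

```python
# Sketch of a surround-view composite: warp each camera frame onto a
# common ground plane ("bird's-eye" view), then paste the warped views
# into one canvas. Real systems calibrate the homographies once against
# ground markers and blend the overlapping seams instead of overwriting.
import cv2
import numpy as np

CANVAS = (800, 800)  # (width, height) of the top-down output, in pixels (assumed)

def birds_eye(frame, homography):
    """Warp one camera frame onto the shared ground-plane canvas."""
    return cv2.warpPerspective(frame, homography, CANVAS)

def compose(frames, homographies):
    """Overlay all warped views on a blank canvas (later cameras win)."""
    out = np.zeros((CANVAS[1], CANVAS[0], 3), dtype=np.uint8)
    for frame, H in zip(frames, homographies):
        warped = birds_eye(frame, H)
        mask = warped.any(axis=2)  # pixels this camera actually filled
        out[mask] = warped[mask]
    return out
```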
 
Actually, ultrasonics will not even pick up a large duffel bag in front of the car, so probably not the dog either. The issue here, by the way, is less that the ultrasonics won't pick it up and more that the camera would show the driver an all-clear, which is worse. It's as if the car had a feature to tell you whether the traffic light is red or green, but whenever it couldn't figure it out it told you green rather than that it didn't know.

I live down a single-track lane. One night on the way home, an old local stray dog was wandering around. I know him; he's deaf and a bit daft. I pulled up, and I only knew he had walked in front of the car because of the change in the headlight reflection off the road. Nothing from the ultrasound, and he was too close to the front bumper to be seen over it, and not visible to the cameras either (I checked).
 
I live down a single-track lane. One night on the way home, an old local stray dog was wandering around. I know him; he's deaf and a bit daft. I pulled up, and I only knew he had walked in front of the car because of the change in the headlight reflection off the road. Nothing from the ultrasound, and he was too close to the front bumper to be seen over it, and not visible to the cameras either (I checked).
Yes, I can also confirm it's the same for the chickens that literally like to "play chicken" on our property. Our Tesla would be perfectly happy to drive over them :O
 
Its effective resolution is not 10 inches. That’s just the closest an object can be for it to consistently register. It absolutely can distinguish between, for example, 11 and 12 inches.

The problem is I regularly park in places with less than 10 inches of clearance, and I wish the ultrasonic sensors went down to at least 6 or 8 inches. Those extra 2-4 inches are when I need the sensors the most.
 

Here's the BMW 360.
It's not just BMW. Here is Audi:

and even non-premium brand Toyota:

and probably every other mainstream car manufacturer offers similar features on cars in Tesla's price range.

This isn't anything new outside the Tesla universe, where people get excited that their car can read text messages, something other cars did more than 10 years ago.
 
Regarding ultrasonics and minimum range: it's a fundamental limitation of the transducers (the things in the bumper) used in cars, since each one is used both as a speaker and as a microphone. It generates the burst (speaker mode) and then has to wait some time for the element to stop "shaking" before it can act as a microphone (so it picks up the actual echo rather than its own vibrations). That dead time corresponds to a round trip of 2 × 10 inches (there and back).
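To put numbers on that, here's a minimal sketch of the arithmetic, assuming roughly 343 m/s for the speed of sound in air at room temperature (generic physics, not Tesla's specs):

```python
# Relates the transducer's ring-down ("dead") time to the minimum
# detectable range: echoes from anything closer come back while the
# element is still vibrating from its own burst, so they are lost.
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C (assumed)
INCH = 0.0254           # meters per inch

def min_range_inches(ringdown_s):
    """Closest detectable distance for a given ring-down time."""
    # The echo's round trip (2 x range) must take longer than the ring-down.
    return SPEED_OF_SOUND * ringdown_s / 2 / INCH

def ringdown_for_range_s(range_inches):
    """Ring-down time implied by a given minimum range."""
    return 2 * range_inches * INCH / SPEED_OF_SOUND

print(ringdown_for_range_s(10) * 1000)  # ~1.48 ms of ring-down for a 10" floor
print(min_range_inches(0.00074))        # ~5" if ring-down were halved to 0.74 ms
```

So getting down to the ~6" clearance people are asking for would need a transducer that settles roughly twice as fast.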

Regarding the 360 view: I understand the "original" TMS did not get it (they had other things to solve). But WTF, why is it not available (at least as an optional package) in today's cars? Especially since you can get it even in "low cost" cars like the Leaf...
 
The Nissan Leaf also has it, and that's way below Tesla's price range.

The thing is that the Model S is a much wider car, so it could use a 360 view a lot more than the smaller cars that have it.

I really hope they figure out something with an augmented view that stitches in what the cameras see and have seen to create a visual representation of what is around the car in tight parking situations.
 
The thing is that the Model S is a much wider car, so it could use a 360 view a lot more than the smaller cars that have it.

I really hope they figure out something with an augmented view that stitches in what the cameras see and have seen to create a visual representation of what is around the car in tight parking situations.

Elon promised Tesla apps and an SDK. I wish they would offer that so people could make their own :)
 
Elon promised Tesla apps and an SDK. I wish they would offer that so people could make their own :)

This would be a fairly complicated app to make though as you'd need instantaneous video feeds.

The thing is, Tesla's FSD visualization will need to generate exactly what we are looking for when navigating tight parking garages with pillars, poles, other vehicles, etc.

To some extent, what we are asking for with the 360 view is something the Tesla software will have to build for itself for FSD anyway.
 
The thing is that the Model S is a much wider car, so it could use a 360 view a lot more than the smaller cars that have it.

I really hope they figure out something with an augmented view that stitches in what the cameras see and have seen to create a visual representation of what is around the car in tight parking situations.
Yeah Elon, just pretend your rover is stuck on Mars and your software team has to get the cr@py old sensors to give you a 360 view. You can do it!
 
This would be a fairly complicated app to make though as you'd need instantaneous video feeds.

The feeds are already there; the streams are used for dashcam / Sentry Mode. The question is whether the processor is capable of processing that many feeds and stitching them, which would mean something like 6 images per frame. You could probably scale them down to a lower resolution and then stitch. I mean, you would have to do it all yourself, and perhaps at a lower frame rate, but it's certainly doable.

Image Stitching with OpenCV and Python - PyImageSearch
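As a toy illustration of that approach (the camera filenames and the 50% downscale are made-up stand-ins for the car's feeds), the high-level OpenCV stitcher from the linked article boils down to something like:

```python
# Toy sketch of the approach described above: downscale a handful of
# overlapping frames, then hand them to OpenCV's high-level stitcher.
import cv2

def stitch(frames):
    """Return a stitched composite of overlapping frames, or None on failure."""
    stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
    status, pano = stitcher.stitch(frames)
    return pano if status == 0 else None  # 0 == Stitcher::OK

# Hypothetical capture files standing in for the car's six camera feeds;
# halving the resolution first keeps the per-frame stitching cost down.
frames = [cv2.resize(cv2.imread(f"cam{i}.jpg"), None, fx=0.5, fy=0.5)
          for i in range(6)]
pano = stitch(frames)
if pano is not None:
    cv2.imwrite("composite.jpg", pano)
```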

The biggest challenge is building an app ecosystem that is sandboxed and secure; it took Google and Apple a number of years to achieve that.