Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Elon confirmed addressing my biggest Tesla software deficiency: Tight Parking/360 view

Battpower

Supporting Member
Oct 10, 2019
1,968
1,942
Uk
Its effective resolution is not 10 inches. That’s just the closest an object can be for it to consistently register. It absolutely can distinguish between, for example, 11 and 12 inches.
Sorry, I wasn't being very careful with my words... Below 10" you just get told "stop", so you can't tell between 6", 7", 8", etc.

Of course you'd be sensible to stop at 12", but if it can reliably indicate 11" vs 12", it would be nice if it could keep going down to, say, 7".

When I typed 'effective resolution 10"', I meant it from the perspective that I don't care about 20", 15" or 12" other than as a rough guide that my eyes can often judge equally well.

It's up close that it makes most difference.
 

whitex

Well-Known Member
Sep 30, 2015
6,455
7,646
Seattle area, WA
Presumably the ultrasonics will pick up said suicidal, unattended dog. Sure, you won’t get a picture of the blur as the dog runs past a camera, but some other warning could suffice in that case.
Actually, ultrasonics will not even pick up a large duffel bag in front of the car, so probably not the dog either. The issue here, by the way, is less that the ultrasonics will not pick it up and more that the camera would show an all-clear to the driver, which is worse. It's as if the car had a feature to tell you whether the traffic light is red or green, but whenever it can't figure it out, it tells you it's green rather than admitting it doesn't know.
 

dannycamps

Member
Apr 8, 2019
678
618
Northeast USA
I recently saw a video for a car (don't recall which one) that not only had a traditional overhead 360-degree camera view but also an alternate view where the car appeared on screen as if you were standing outside it, watching from a distance.

I'm not sure how that one was even possible.
 

pilotSteve

Active Member
Jul 14, 2012
1,473
1,337
Prescott Az
The 360-degree "surround" image is synthesized by splicing multiple camera images together (with appropriate warping/cropping so the resulting composite shows more-or-less full coverage). It's not magic, it's PFM software with multiple overlapping sensors.
 

BHCLUC

Member
Nov 7, 2017
232
168
Chicago
I recently saw a video for a car (don't recall which one) that not only had a traditional overhead 360-degree camera view but also an alternate view where the car appeared on screen as if you were standing outside it, watching from a distance.

I'm not sure how that one was even possible.

I think this is BMW.
 

pgkevet

Active Member
Jul 1, 2019
1,167
1,026
mid wales
Actually, ultrasonics will not even pick up a large duffel bag in front of the car, so probably not the dog either. The issue here, by the way, is less that the ultrasonics will not pick it up and more that the camera would show an all-clear to the driver, which is worse. It's as if the car had a feature to tell you whether the traffic light is red or green, but whenever it can't figure it out, it tells you it's green rather than admitting it doesn't know.

I live down a single-track lane. One night on the way home, an old local stray dog was wandering around. I know him, and he's deaf and a bit daft. I pulled up, and I only know he walked in front of the car because of the change in headlight reflection off the road. Nothing from the ultrasonics; he was too close to be visible over the front bumper and not visible to the cameras (I checked).
 
  • Informative
Reactions: Cheburashka

dikdastard

New Member
Oct 22, 2019
3
1
Blighty, uk
I live down a single-track lane. One night on the way home, an old local stray dog was wandering around. I know him, and he's deaf and a bit daft. I pulled up, and I only know he walked in front of the car because of the change in headlight reflection off the road. Nothing from the ultrasonics; he was too close to be visible over the front bumper and not visible to the cameras (I checked).
Yes, I can confirm it's the same case for the chickens that literally like to "play chicken" on our property. Our Tesla would be perfectly happy to drive over them :O
 

PhilDavid

Active Member
May 22, 2018
2,552
1,839
Philadelphia
Its effective resolution is not 10 inches. That’s just the closest an object can be for it to consistently register. It absolutely can distinguish between, for example, 11 and 12 inches.

The problem is I regularly park in places with less than 10 inches of clearance, and I wish the ultrasonic sensors at least went down to about 6 or 8 inches. Those extra 2-4 inches are when I need the sensors the most.
 

whitex

Well-Known Member
Sep 30, 2015
6,455
7,646
Seattle area, WA

Here's the BMW 360.
It's not just BMW. Here is Audi:

and even non-premium brand Toyota:

and probably every other mainstream car manufacturer has similar features in cars in Tesla's price range.

This isn't anything new outside of the Tesla universe, where people get excited to have their car read text messages, something other cars did more than 10 years ago.
 
  • Informative
Reactions: PhilDavid

E-Ryc

Member
Jun 6, 2018
154
115
Prague, CZ (EU)
Regarding ultrasonics and minimum range - it's a fundamental limitation of the transducers (the things in the bumper) used in cars: each one is used as both a speaker and a microphone. It generates the burst (speaker mode) and then needs to wait some time for the element to stop ringing before it can act as a microphone (so that it picks up the actual echo rather than its own vibrations). That wait time corresponds to about 2 × 10 inches of sound travel (there and back).
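For a sense of the arithmetic, the dead zone follows directly from the ring-down time. A minimal sketch, assuming an illustrative ~1.5 ms ring-down (not an actual Tesla sensor spec):

```python
# Rough dead-zone estimate for an automotive ultrasonic sensor.
# The ring-down time below is an assumed, illustrative value.
SPEED_OF_SOUND_IN_PER_S = 13_504  # ~343 m/s at 20 °C, in inches per second

def dead_zone_inches(ringdown_s: float) -> float:
    """Minimum usable range: the echo must arrive after the transducer
    stops ringing. The pulse travels to the object and back, hence /2."""
    return SPEED_OF_SOUND_IN_PER_S * ringdown_s / 2

# A ~1.5 ms ring-down gives a dead zone of roughly 10 inches:
print(round(dead_zone_inches(0.0015), 1))  # prints 10.1
```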

Regarding the 360 view - I understand the "original" Model S did not get it (they had other things to solve). But WTF, why is it not available (at least as an optional package) in today's cars? Especially as you can get it in "low cost" cars like the Leaf, after all...
 

PhilDavid

Active Member
May 22, 2018
2,552
1,839
Philadelphia
The Nissan Leaf also has it, and it's way below Tesla's price range.

The thing is that the Model S is a much wider car, so it could use a 360 view a lot more than the smaller cars that have it.

I really hope they figure out something with an augmented view that stitches in what the cameras see and have seen to create a visual representation of what is around the car in tight parking situations.
 

emmz0r

Senior Software Engineer
Jul 12, 2018
1,169
931
Norway
The thing is that the Model S is a much wider car so could use 360 view a lot more than smaller cars that have it.

I really hope they figure out something with an augmented view that stitches in what the cameras see and have seen to create a visual representation of what is around the car in tight parking situations.

Elon promised Tesla apps and an SDK. I wish they would offer that so people could make their own :)
 

PhilDavid

Active Member
May 22, 2018
2,552
1,839
Philadelphia
Elon promised Tesla apps and an SDK. I wish they would offer that so people could make their own :)

This would be a fairly complicated app to make, though, as you'd need instantaneous video feeds.

The thing is, Tesla's FSD visualization will need to generate exactly what we are looking for when navigating tight parking garages with pillars, poles, other vehicles, etc.

To some extent, what we are asking for with a 360 view is something the Tesla software will have to build for itself for FSD.
 

Battpower

Supporting Member
Oct 10, 2019
1,968
1,942
Uk
The thing is that the Model S is a much wider car so could use 360 view a lot more than smaller cars that have it.

I really hope they figure out something with an augmented view that stitches in what the cameras see and have seen to create a visual representation of what is around the car in tight parking situations.
Yeah Elon. Just pretend your Rover is stuck on Mars and your software team has to get the old sensors to give a 360 view. You can do it!
 

emmz0r

Senior Software Engineer
Jul 12, 2018
1,169
931
Norway
This would be a fairly complicated app to make though as you'd need instantaneous video feeds.

The feeds are already there. The streams are used for dashcam / Sentry Mode. The question is whether the processor is capable of processing X number of feeds and stitching them, which would mean something like 6 images per frame. You could probably scale them down to a lower resolution and then stitch. You would have to do it all yourself, and perhaps at a lower framerate, but it's certainly doable.

Image Stitching with OpenCV and Python - PyImageSearch

The biggest challenge is to make an app ecosystem which is sandboxed and secure. It took Google and Apple a number of years to achieve this.
 

artsci

Sponsor
May 10, 2012
6,256
3,234
Timonium, Maryland
Third-party hardware and software has been available to do this for more than a year. I have it on my Model S, as do dozens of others. See this video for its operation:


This can be purchased from BearBu, whose company in Hong Kong developed it all. See this thread.
 

About Us

Formed in 2006, Tesla Motors Club (TMC) was the first independent online Tesla community. Today it remains the largest and most dynamic community of Tesla enthusiasts. Learn more.
