Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Crash with needs to be smarter summons

electronblue

Active Member
Oct 1, 2018
2,325
2,411
Earth
I think two different possible vulnerabilities are on display here:

1. Smart Summon is unable to anticipate and react to other vehicles in a way a human could. In the first case a human driver could have honked the horn, or simply noticed the reversing lights and stayed put while the other car was backing out. This is something Tesla can develop actions for over time, but it is reminiscent of the hard time even Waymo is having with close-up scenarios and how to maneuver in them.

2. If accurate, hitting a part of the garage when driving forward is potentially the result of two things: the NN might be unable to recognize such obstacles (and ultrasonics are notoriously bad with protruding objects and thin shapes like shelves on walls). It is also possible the known camera blindspot around the forward quarter of the car made it hard to see such objects. This is something many of us have viewed as a much harder problem, because the camera blindspots are there and only the ultrasonics cover them, a limitation of the sensor suite.

The first case also provides an interesting take on the problems facing human supervision of semi-autonomous features. In this case the driver was supervising his car, but from the vantage point where he stood it was impossible to see the reversing car in time — and the only control he had was stopping his own car by releasing the button. A remote honk might have been helpful, but even then there may not have been sufficient time, so it would have had to be automated... but since the system is semi-autonomous we can’t trust the automation. Difficult case.
 

banned-66611

Guest
These accidents show the fundamental flaw in summon: the driver doesn't have enough visibility to prevent accidents.

The drivers didn't see the stationary objects in time to prevent crashing into them. The one who hit another car must not have seen it either, or didn't react fast enough. The idea that you can drive a car remotely from your phone is insane.
 

electronblue

Active Member
Oct 1, 2018
2,325
2,411
Earth
These accidents show the fundamental flaw in summon: the driver doesn't have enough visibility to prevent accidents.

The drivers didn't see the stationary objects in time to prevent crashing into them. The one who hit another car must not have seen it either, or didn't react fast enough. The idea that you can drive a car remotely from your phone is insane.

Some of the cases may also show another flaw which is without downward facing cameras the car has many blindspots when it comes to nearby objects, especially around the front quarter.
 

ramonneke

Active Member
Apr 26, 2018
3,548
2,038
Rotterdam
the NN might be unable to recognize such obstacles (and ultrasonics are notoriously bad with protruding objects and thin shapes like shelves on walls). It is also possible the known camera blindspot around the forward quarter of the car made it hard to see such objects.

What I think happened is that the mapped environment was lost while the car was parked. Then Summon was started and rebuilt the environment based on what it could see. It is strange, though, as the B-pillar cams should have noticed their view was blocked.
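The rebuild-on-start idea can be sketched as a small occupancy grid repopulated from whatever range returns are available when the session begins. This is purely illustrative; the grid size, cell resolution, and return format are made up, not Tesla's actual data structures:

```python
import math

CELL_M = 0.5    # metres per grid cell (arbitrary illustration value)
GRID_N = 20     # grid is GRID_N x GRID_N cells

def rebuild_grid(returns):
    """Start from an empty (all-unknown) grid, as if the parked map was
    lost, and mark the estimated hit point of each current range return.
    `returns` is a list of (sensor_x, sensor_y, heading_rad, range_m)."""
    grid = [[0] * GRID_N for _ in range(GRID_N)]
    for sx, sy, heading, rng in returns:
        hx = sx + rng * math.cos(heading)   # estimated obstacle position
        hy = sy + rng * math.sin(heading)
        col, row = int(hx / CELL_M), int(hy / CELL_M)
        if 0 <= row < GRID_N and 0 <= col < GRID_N:
            grid[row][col] = 1              # possibly occupied
    return grid
```

Under this model, a wall that was mapped before parking simply isn't there until some sensor happens to return off it again, which would explain collisions with objects sitting in a camera blindspot.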
 

banned-66611

Guest
What I think happened is that the mapped environment was lost while the car was parked. Then Summon was started and rebuilt the environment based on what it could see. It is strange, though, as the B-pillar cams should have noticed their view was blocked.

What environment map? All the evidence we have so far is that the system relies on sensors only. The convoluted route the car takes sometimes seems to back that up, i.e. it's not doing any route planning based on knowledge of the car park layout.
 

banned-66611

Guest
This environment map on your phone with Smart Summon:

View attachment 460982

Ah, I thought you meant the self-driving environment map that people keep claiming exists, without any evidence.

So this is just the ultrasonics. It's basically what your robot vacuum cleaner can see - raised obstacles.

It's probably a really bad idea to show it. All it does is give the driver a false sense of security, making them think they can see what is happening when in fact the ultrasonics are not at all reliable enough for this.
 

emmz0r

Senior Software Engineer
Jul 12, 2018
1,216
994
Norway
Ah, I thought you meant the self-driving environment map that people keep claiming exists, without any evidence.

So this is just the ultrasonics. It's basically what your robot vacuum cleaner can see - raised obstacles.

It's probably a really bad idea to show it. All it does is give the driver a false sense of security, making them think they can see what is happening when in fact the ultrasonics are not at all reliable enough for this.

Do ultrasonics go that far? I strongly doubt that. I think it's the real camera-based 3D map, but the logic behind using that map is not great.

upload_2019-9-30_19-35-57.png
 

diplomat33

Well-Known Member
Aug 3, 2017
7,785
9,106
Terre Haute, IN USA
Do ultrasonics go that far? I strongly doubt that. I think it's the real camera-based 3D map, but the logic behind using that map is not great.

According to Tesla's own AP2 diagram, the ultrasonics have a max range of 8 meters, so it could be the ultrasonics, depending on the size of the parking lot. But I would assume that the cameras could supplement the ultrasonics for things like a moving car or a pedestrian walking in the path, or for confirming whether something is a curb. After all, we know Smart Summon was delayed in part because getting the cameras to detect curbs reliably was proving more difficult than originally anticipated.

800px-Tesla_AP2_Hardware.jpg
 

mongo

Well-Known Member
May 3, 2017
13,102
39,900
Michigan
Ultrasonic sensors typically have no angular resolution: either there is a return somewhere in their beam or there isn't. That point cloud can't be pure ultrasonic.
You might do something with signal processing linking all the sensors, but you would need a wider-than-normal beam to make that work (poorly).
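The "linking sensors" idea is essentially trilateration: one range-only return constrains the obstacle to an arc, but where two sensors' beams overlap, intersecting the two range circles narrows it to two mirror-image candidates (the beam direction rules one out). A minimal sketch, with made-up sensor positions and no beam-width modelling:

```python
import math

def range_circle_intersections(p0, r0, p1, r1):
    """Candidate obstacle positions consistent with range returns r0 and
    r1 from two sensors at p0 and p1 (standard two-circle intersection).
    A single sensor alone would only give the whole circle/arc."""
    (x0, y0), (x1, y1) = p0, p1
    d = math.hypot(x1 - x0, y1 - y0)
    if d == 0 or d > r0 + r1 or d < abs(r0 - r1):
        return []                       # returns are inconsistent
    a = (r0**2 - r1**2 + d**2) / (2 * d)
    h = math.sqrt(max(r0**2 - a**2, 0.0))
    xm, ym = x0 + a * (x1 - x0) / d, y0 + a * (y1 - y0) / d
    # two mirror-image candidates on either side of the sensor baseline
    return [(xm + h * (y1 - y0) / d, ym - h * (x1 - x0) / d),
            (xm - h * (y1 - y0) / d, ym + h * (x1 - x0) / d)]
```

With typical narrow automotive beams the overlap region is small, which is the point above: neighbouring sensors would need a wider-than-normal beam to see the same obstacle at all.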
 

thewishmaster

Member
Jun 4, 2018
547
429
California
Ah, I thought you meant the self-driving environment map that people keep claiming exists, without any evidence.

So this is just the ultrasonics. It's basically what your robot vacuum cleaner can see - raised obstacles.

It's probably a really bad idea to show it. All it does is give the driver a false sense of security, making them think they can see what is happening when in fact the ultrasonics are not at all reliable enough for this.

Ironically, robot vacuum cleaners usually use a range-finding laser scanner (not quite the same as the lidar in cars), with higher precision than ultrasonics would provide, though I'm not familiar with all types of vacuums. There is enough evidence that Smart Summon uses cameras for both navigation and obstacle detection (e.g. it'll show "waiting for pedestrians" if you walk in front of it, but not if there's a non-pedestrian obstacle).

Not sure if it has any prediction stuff yet (e.g. predicting paths of potential obstacles) though.
 

About Us

Formed in 2006, Tesla Motors Club (TMC) was the first independent online Tesla community. Today it remains the largest and most dynamic community of Tesla enthusiasts. Learn more.
