
Enhanced Summon, where are you?

You are talking about the static world. In the AI interview with George Hotz, he explains this well.
- Static world
- Dynamic world, where lots of actors are moving and you have to predict their behavior
- A world where your own actions change how the various actors behave, so you have to predict those changes for each choice you could make

I think a great approach would be to reliably identify all objects in all camera feeds before moving on to (and selling, for $6,000) greater things.

As long as that system can't differentiate a roadkill cat from a chalk-painted cat from a living cat, or figure out whether a smudge is a lane marker, whether an item is a traffic cone, whether something is a shadow or a semi, whether the surface is drivable or an obstacle, whether something is a piece of wind-blown paper or a shovel, etc. etc., it's simply premature to dream of (and especially to sell) FSD.

You know, the vision intelligence of a regular Joe. If the NN can't do this, it's simply futile to even think about predictions (which are probably easier than reliable object detection).

If the NN plus sensor HW could identify all objects like a human can, I’m convinced that the route planning task, even if objects move dynamically, is comparatively trivial.

The autonomy day was a sales pitch. The real object detection abilities (laughably inadequate) are evident in the actual product. Maybe HW3 is the savior. Looking forward to seeing real HW3 performance (forget about the autonomy day infomercial) in the future.
 
Do you think that's true on an E[Property Value Loss] per use basis? I'm not sure, even though I am only going by videos of Enhanced Summon at this point.

Could we use that as a metric for suckage?

Well, it is easy to be safe when the feature doesn't really do anything. Sure, the current summon is super safe but that is because it doesn't really do anything. It can only creep forward and backwards a few feet. Not to mention that my phone frequently cannot connect to Summon. So yeah, a feature that is not very capable and not very reliable really sucks. I say Enhanced Summon sucks less because it is at least more capable. It can do more than the current Summon. It can actually move around a parking lot instead of just creeping forward or backwards. Of course, with increased capability does come extra risk. But that is why Tesla needs to be more careful in releasing Enhanced Summon.
 
If the NN plus sensor HW could identify all objects like a human can, I’m convinced that the route planning task, even if objects move dynamically, is comparatively trivial.
If this were the case, Waymo would be at L5 by now. You write down the rules and then we'll talk about whether it's trivial or not. When the best minds in the industry say it's the most difficult part, it is weird that you dismiss it as trivial.
 
If this were the case, Waymo would be at L5 by now. You write down the rules and then we'll talk about whether it's trivial or not. When the best minds in the industry say it's the most difficult part, it is weird that you dismiss it as trivial.

Slightly off topic, but I believe the only way to achieve the march of 9s will be through replacing more traditional software (software 1.0 as Karpathy puts it) with more neural network derived software.

Right now the inputs to the NN are billions and billions of pixel hue values across different spaces and across time, and the outputs are a 3D representation of object recognition. Once objects are identified, the traditional software handles all of the other rules of the road.

With software 2.0, we'd have the same NN inputs, but the outputs would be nothing but steering wheel angle, accelerator, and brakes. Skipping the heuristics and directly connecting inputs to driving behavior outputs. Skip the labor-intensive object labeling and use the inputs we're putting into our steering wheels and pedals to train the NN. It's hard to imagine, and it will take a metric ton of training data to achieve, but I think Tesla is the only company positioned to achieve this.
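
To make that concrete, here is a minimal sketch of what such an end-to-end network might look like. Everything here is an assumption for illustration (PyTorch, made-up layer sizes, input resolution, and loss); it is not Tesla's actual architecture. The point is only that the training signal is the human's wheel and pedal inputs, with no object labels anywhere in the loop.

import torch
import torch.nn as nn

class EndToEndDriver(nn.Module):
    """Maps raw camera frames directly to control outputs (no object labels)."""
    def __init__(self):
        super().__init__()
        # Small convolutional encoder over a single RGB camera frame.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # pool to 64 features regardless of input resolution
            nn.Flatten(),
        )
        # Regression head: [steering angle, accelerator, brake] -- the "software 2.0" outputs.
        self.head = nn.Sequential(
            nn.Linear(64, 128), nn.ReLU(),
            nn.Linear(128, 3),
        )

    def forward(self, frames):
        return self.head(self.encoder(frames))

# Training sketch: the labels are simply what the human driver did with the wheel and pedals.
model = EndToEndDriver()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

frames = torch.randn(8, 3, 120, 160)   # stand-in for a batch of camera frames
human_controls = torch.randn(8, 3)     # stand-in for logged steering/pedal values
optimizer.zero_grad()
loss = loss_fn(model(frames), human_controls)
loss.backward()
optimizer.step()

Scaling a toy like this up to handle the long tail is exactly where the metric ton of training data comes in, which is why the fleet matters.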
 
Well, it is easy to be safe when the feature doesn't really do anything. Sure, the current summon is super safe but that is because it doesn't really do anything. It can only creep forward and backwards a few feet.

And the car still manages to hit things on regular summon before people can react in time to stop it. That is just based on the number of "Summon hit my garage before I could stop it!" posts I have seen here and elsewhere. That may just be user knowledge issues (not knowing that you can abort by pressing a door handle even if the app stopped responding mid-summon). But super safe summon still hits stuff.

I can imagine the number of things the car manages to hit on enhanced summon will be enhanced exponentially. ;)



Now that I think about it, an empty car silently backing out of a parking space could really be a safety hazard for pedestrians in the area. (And you know everyone who uses summon to pick themselves up from the curb is not going to double-check for pedestrians around the car.) Hopefully the car will be really, really, really good at seeing them with the current sensors.

I know that on the airfield where I work, a warning notice had to go out when we got some UAVs temporarily on the line. Everyone had to be warned that the aircraft might move unexpectedly at any time and to always be cautious near them. And that was a warning still needed for trained ground crew, not Joe Q. Public, who will unknowingly be interacting with an unmanned vehicle.
 
Of course, with increased capability does come extra risk.

So it seems like the expected value of property damage per use might actually be a good metric for determining suckage, relative to a feature we already have. Enhanced Summon is a lot more capable than regular Summon, but it also has to be a lot better at avoiding things to make sure its expected property damage is less than the original Summon's (though it sounds like that may be too low a bar).
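
As a back-of-the-envelope illustration, here is that metric written out. Every number below is invented; only the formula (incident probability per use times average repair cost) reflects the idea in this thread.

# Toy comparison of "expected property damage per use" for the two features.
def expected_damage_per_use(p_incident, avg_repair_cost):
    return p_incident * avg_repair_cost

regular = expected_damage_per_use(p_incident=1e-4, avg_repair_cost=500)     # short creep in a garage
enhanced = expected_damage_per_use(p_incident=2e-4, avg_repair_cost=1500)   # longer trip across a lot

print(f"Regular Summon:  ${regular:.2f} per use")
print(f"Enhanced Summon: ${enhanced:.2f} per use")
# With these made-up numbers Enhanced Summon comes out several times worse, so its obstacle
# avoidance would need to improve proportionally just to break even on this metric.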

But that is why Tesla needs to be more careful in releasing Enhanced Summon.

As far as being careful, they do seem to be exercising caution. Sounds like maybe Enhanced Summon next year sometime? Based on the email that went out to EAP owners today, one could reasonably conclude they're not going to provide Enhanced Summon to EAP owners. (Though I suspect that won't actually be the case, that is how the email reads if you take it literally!) So maybe it's just too hard with HW 2.5!
 
If this were the case, Waymo would be at L5 by now. You write down the rules and then we'll talk about whether it's trivial or not. When the best minds in the industry say it's the most difficult part, it is weird that you dismiss it as trivial.

The reason nobody has L5 out there is that there is no sensor-plus-computer system in existence on this planet that can detect all objects with the precision required to guarantee safe operation.

It’s not because they didn’t figure out how to navigate around that.

You mix up challenging vision problems with driving rules.
 
The reason nobody has L5 out there is that there is no sensor-plus-computer system in existence on this planet that can detect all objects with the precision required to guarantee safe operation.

It’s not because they didn’t figure out how to navigate around that.

You mix up challenging vision problems with driving rules.

I don't think you are correct. Waymo cars with their 360 degree lidar, cameras and radar should definitely be able to see all objects around the car with very high precision. The problem is in fact driving rules, not vision. While they actually can see all objects with very high precision, they can't always anticipate what the objects will do. That's the problem. Knowing where all objects are around the car is only half the battle. The tricky part is writing the driving policies to handle all the edge cases of what the objects may do. That is what is currently holding up companies like Waymo from getting to L5, not vision.
 
And the car still manages to hit things on regular summon before people can react in time to stop it. That is just based on the number of "Summon hit my garage before I could stop it!" posts I have seen here and elsewhere. That may just be user knowledge issues (not knowing that you can abort by pressing a door handle even if the app stopped responding mid-summon). But super safe summon still hits stuff.

I can imagine the number of things the car manages to hit on enhanced summon will be enhanced exponentially. ;)

Now that I think about it, an empty car silently backing out of a parking space could really be a safety hazard for pedestrians in the area. (And you know everyone who uses summon to pick themselves up from the curb is not going to double-check for pedestrians around the car.) Hopefully the car will be really, really, really good at seeing them with the current sensors.

One of the things I find frustrating about Enhanced Summon is that it feels like they skipped a step.

Regular Summon and regular self park are limited to using just the ultrasonics, and those are obviously pretty blind as a result.

But instead of trying to improve those with vision, they immediately jumped to working on Enhanced Summon to go clear across a parking lot. I think they would have been better off with a much more incremental approach.

Things like making auto park better by taking advantage of the cameras, and getting rid of the limitation of needing cars on both sides while reverse parking.

Or a better Summon that could be trained on a specific route. Most people are going to want to summon the car with a very specific motion. Like I'd love to summon my car from one parking spot to the one next to it so I can pull my other car out of the garage.

As to Enhanced Summon, I think its biggest issue won't be hitting things, but being so massively annoying to other drivers that fights will break out.
 
Or a better Summon that could be trained on a specific route. Most people are going to want to summon the car with a very specific motion. Like I'd love to summon my car from one parking spot to the one next to it so I can pull my other car out of the garage.
You can use Smart Summon for this!
As to Enhanced Summon, I think its biggest issue won't be hitting things, but being so massively annoying to other drivers that fights will break out.
Very doubtful. If it is annoyingly slow, people won't use it in busy parking lots. Like someone posted (Reddit?), it is good for work parking lots, which generally have light traffic.
 
How will L5 ever work in bad weather, when you would want it most?
One of the things I find frustrating about Enhanced Summon is that it feels like they skipped a step.

Regular Summon and regular self park are limited to using just the ultrasonics, and those are obviously pretty blind as a result.

But instead of trying to improve those with vision, they immediately jumped to working on Enhanced Summon to go clear across a parking lot. I think they would have been better off with a much more incremental approach.

Things like making auto park better by taking advantage of the cameras, and getting rid of the limitation of needing cars on both sides while reverse parking.

Or a better Summon that could be trained on a specific route. Most people are going to want to summon the car with a very specific motion. Like I'd love to summon my car from one parking spot to the one next to it so I can pull my other car out of the garage.

As to Enhanced Summon, I think its biggest issue won't be hitting things, but being so massively annoying to other drivers that fights will break out.

Seems like the focus is more on releasing more parlor tricks/gimmicks than perfecting prior releases (turning prior parlor tricks/gimmicks into reliable, functional features).
 
But Enhanced Summon is difficult...
No, it is not difficult. It is easy.
[attached image: Musk.png]


Actually, it already works:
[attached image: summon.png]
 
If this were the case, Waymo would be at L5 by now. You write down the rules and then we'll talk about whether it's trivial or not. When the best minds in the industry say it's the most difficult part, it is weird that you dismiss it as trivial.
Yes. Waymo has been writing code for ten years and it is not yet ready. So it is definitely not trivial. If it were trivial, with Google's resources and ten years, it would be ready.
 
I don't think you are correct. Waymo cars with their 360 degree lidar, cameras and radar should definitely be able to see all objects around the car with very high precision. The problem is in fact driving rules, not vision. While they actually can see all objects with very high precision, they can't always anticipate what the objects will do. That's the problem. Knowing where all objects are around the car is only half the battle. The tricky part is writing the driving policies to handle all the edge cases of what the objects may do. That is what is currently holding up companies like Waymo from getting to L5, not vision.

Yes. Amnon Shashua (see his Wikipedia entry) says "object recognition is solved, driving policy is the hard part."
 
How will L5 ever work in bad weather, when you would want it most?


Seems like the focus is more on releasing more parlor tricks/gimmicks than perfecting prior releases (turning prior parlor tricks/gimmicks into reliable, functional features).

Uh?! The whole point of Enhanced Summon is to turn a parlor trick (the current Summon) into something more useful and functional (navigating a parking lot).
 
Uh?! The whole point of Enhanced Summon is to turn a parlor trick (the current Summon) into something more useful and functional (navigating a parking lot).

I inadvertently mixed two thoughts in the same post. The first, L5 in inclement weather, was a general question. Think about a snowstorm, where vision and lidar would struggle. What would the L5 plan or solution be? GPS might be unavailable as well. In the northern latitudes, this weather condition is frequent. Think MSP.

Re parlor tricks and gimmicks, that too was a general statement, but it could be applied to Summon, and perhaps Enhanced Summon, depending on how well it works (based on popular opinion) once released. My opinion was based on the fact that we have a lot of features and functions today that are more parlor tricks than proven, reliable features. Releasing Enhanced Summon with only parlor-trick capability would just be another example of "more parlor tricks/gimmicks..." rather than turning existing parlor tricks (e.g., all the Beta stuff) into reliable, functional features.
 
I need to apologize for my harsh critique.

As a member of the super exclusive power users group, I just received a V10 update and someone at Tesla messed up by including source code.

It’s for HW3 and it addresses my main grievances.

Sorry for the formatting. Messed up by the forum.


void Autonomy::Main(void) // runs for every captured frame
{
    ...
    NN.IdentifyAllObjects(); // 100% perfect. See Amnon Shashua, Andrej Karpathy et al.
    World.UpdateScene(NN.GetAllObjects()); // create 3D map of the world around the car, with motion vectors for all objects for prediction

    FSD.Run();
    World.Randomize(Infotainment.GetAvgAmplitudeByFreq(20, 1000)); // for dancing cars. Requested by Elon 2015 for entertainment (see fart cushion spec)
    // octo: please make this a configurable option. K. Thx.
    ...
}

void TeslaFSD::Run(void)
{
    ...
    switch (mode)
    {
    case fsd_robotaxi:
        // todo: add S3XY NN based path processing. ETA: 12/2020
        break;
    case fsd_enhanced_summon:
        EnhancedSummonAlmostNotSucking.Run(); // based on Joe Oldschool's outdated rules algo.
        // He moved to the infotainment group to work on Atari Centipede for the arcade (less stress).
        // todo: re-write from scratch with S3XY NN based approach
        break;
    ...
    }
    ...
}