1. DirtyTesla on Twitter
2. David F Guajardo on Twitter
3. AB on Twitter
4. Roddie Hasan - راضي on Twitter
Tesla released this feature less than 24 hours ago. It has only been rolled out to a portion of the fleet. There have already been multiple confirmed accidents.
This is dangerous garbage. Even when it "works," the videographers make piles of apologies for its poor and erratic behavior -- even in completely empty parking lots.
The only people whose "minds are blown" are people who have never been in an actual L4 prototype vehicle. There are multiple L4 companies that together have driven ~20 million miles, with fewer at-fault accidents than Enhanced Summon has had in 24 hours.
I have instructed my family to literally run away from any Tesla that seems to be driving oddly. I have a 14-month-old toddler, and I literally fear for her life now.
Generally speaking, when you post a bunch of links to videos, they should back up the main point you're trying to make.
Is Smart Summon garbage, or is it dangerous? If it's garbage, people won't use it, and hence it's not dangerous. It seems like you're trying to argue that it's both.
The first link shows that it's garbage, but it was a weird place to test it; I don't think Tesla had that use case in mind.
The second link doesn't show anything; it's simply an accident that wasn't the Tesla's fault. That said, it likely wouldn't have happened to a human driver, since a human driver would have honked. In your post you mentioned "at-fault accidents," but you didn't link to any videos of at-fault accidents.
The third link shows damage to the car, but we have no idea what the owner was attempting.
The fourth link shows smart summons being dangerous. It was almost an at-fault accident, but everyone stopped before it happened.
So of those four videos, only one really shows it being dangerous. There are plenty of videos that do a better job than your first link of showing it being stupid, so I'm not sure why you chose that one.
I'm not going to make any excuses for Smart Summon: in my own testing it was neither consistent enough, nor did it show any confidence-building behavior, to lead me to trust it beyond the empty-parking-lot testing I was doing. There is plenty of evidence showing it being inconsistent and failing to pick the path a human driver would.
But is it dangerous? I'd say that is highly dependent on what the owner asks of it.
Will the owner abide by the requirement to stay within visual sight of the vehicle, and will they monitor the vehicle as it drives towards them? Will they identify risk factors like toddlers and small children on the route, especially at the start (where a small kid might be hidden from the cameras)?
With a responsible owner I don't believe it's all that dangerous; at least no more so than the average parking lot already is. Sure, there is a slight delay in stopping when the button is released, but pedestrian detection seems solid.
The fourth link you posted is the best counterpoint to my argument, since it shows the car's vision system failing to identify a car coming the other way. I can't explain why it failed.
The L4 driving you mentioned isn't applicable to what Tesla is doing. The sensors, and the situations the cars are being tested in, are completely different. They are not commercially released vehicles, and they would be far more expensive than anyone could afford even if they were somehow released.
If I had a 14-month-old toddler, my fear of parking lots would already be at 5 out of 5. I simply don't see a Tesla adding to that.