
Autonomous Car Progress

I suspect some of the reluctance to use robotaxis stems from the lack of control that a passenger has. When using a human-driven taxi, passengers have at least an illusion of control in that they can talk to the driver. Perhaps RT companies could provide some sort of interaction with the driving computer, so that if someone says, "watch out for that kid up there," the car could respond with, "I see the child, sir; I've determined that there is only a 12.79% chance of her running into the street in front of us, but I have my virtual foot on the brake pedal just in case." Or it could even say things without prompting, like, "Whoa! Did you see that idiot swerving in the lane? He must be drunk! I'll give him plenty of room while I livestream him to the nearest robocop."

Obviously, I'm being a bit facetious, but providing some interaction between the passenger and robodriver might be helpful.
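Just to sketch what I mean (everything here is hypothetical, with made-up names and numbers, not any vendor's actual interface), a passenger's query could simply be routed to whatever object list the planner already maintains:

```python
# Hypothetical sketch of a passenger-to-robodriver query; not any vendor's real API.
from dataclasses import dataclass

@dataclass
class TrackedObject:
    label: str            # e.g. "pedestrian_child"
    crossing_prob: float  # planner's estimate that the object enters our path

def answer_passenger(query: str, tracked: list[TrackedObject]) -> str:
    """Turn a spoken concern into a report based on the planner's object list."""
    if "kid" in query or "child" in query:
        kids = [t for t in tracked if t.label == "pedestrian_child"]
        if not kids:
            return "I don't currently see a child near our path."
        p = max(t.crossing_prob for t in kids)
        return (f"I see the child. Estimated {p:.0%} chance they step into our path; "
                "I'm covering the brake just in case.")
    return "Noted. I'm monitoring everything around the vehicle."

print(answer_passenger("watch out for that kid up there",
                       [TrackedObject("pedestrian_child", 0.13)]))
```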

Lack of control is definitely part of the resistance to robotaxis. I can imagine it can be a bit scary at first to be in the back seat of a car with no human driver, since you have to totally trust the invisible "driver". But there is also a lack of control for people outside the robotaxi. A person on Twitter told me that robotaxis need to communicate with pedestrians to let them know that they see them and will stop, so that pedestrians can know whether it is safe to walk in front of the robotaxi. After a bad experience with a Cruise not slowing down while her husband was crossing the street, she does not feel she can trust walking around robotaxis. This is one area where a change in mindset will be required: we are so used to interacting with other human drivers, where we can give a hand signal or head nod to communicate intent, and we can't do that with a driverless car.

There is also the lack of control from law enforcement and first responders. Again, with humans they can communicate and tell the human driver what to do if a road is closed, etc. But with a driverless car, first responders feel helpless since they can't do anything if the robotaxi is stuck or in the way.
 
Is there any evidence to support the idea that people in general are "afraid of" or "reluctant to use" robotaxis?

Because as far as I know, while some past narrow surveys have been all over the place on this, the few locations that offer them in any capacity have waiting lists to use them... and I'm not aware of anybody trying to be in the robotaxi business who has expressed this-- rather than actually getting their RT working properly and in a way it's possible to scale-- as a serious business concern.


Bad analogies to evolution aside, humans tend to adopt new technology fairly rapidly when it actually works well and is easy to use.
 
Is there any evidence to support the idea that people in general are "afraid of" or "reluctant to use" robotaxis?
How about:

Pew Research

Many Americans are not only reluctant to ride in driverless cars, some are also concerned about sharing the road with one. In total, 45% of Americans say they would not feel comfortable sharing the road with driverless vehicles if use of them became widespread, including 18% who would not feel comfortable at all. Smaller shares indicate they would be extremely (7%) or very (14%) comfortable sharing the road with autonomous vehicles.

Those numbers will change with time, of course, as people get used to the change.
 
It's all summarized well in this nugget:

Start: I am going to die
End: This thing is better than me
That's hilarious! I remember showing off Summon when I got into the Beta. The car started to back out of the spot, but a family was getting out of their car, so the Tesla stopped and waited for them. As they moved away from the Tesla, the car started to move again, and the wife looked behind her, saw the car was empty, and started panicking. "OMG, there's no one in the car!! What the hell is happening?" She rushed with the two kids to the side, while her husband looked into my car, which stopped as he approached it. He looked up and around, noticed me at the restaurant entrance with my phone, pointed at me, and started laughing. He then gave me a thumbs up.
 

Yes, how about it?

It suggests that even today, when there's no actual proven/widespread product on the market, 37% of Americans would ride in a robotaxi.

That's an addressable market of over 100 million people in the US alone. In other countries where personal vehicle ownership isn't so huge a cultural thing it'd probably be even higher.
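As a rough sanity check on that number (the population figures below are my own ballpark assumptions, not from the Pew study):

```python
# Back-of-envelope check; population figures are approximate assumptions, not Pew data.
us_population = 331_000_000   # roughly the 2020 census total
us_adults     = 258_000_000   # rough count of adults (18+)
would_ride    = 0.37          # share in the Pew survey saying they'd ride in one

print(f"All US residents: {us_population * would_ride / 1e6:.0f} million")  # ~122 million
print(f"Adults only:      {us_adults * would_ride / 1e6:.0f} million")      # ~95 million
```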


Further, if you limit it to people 50 and younger, you get nearly half of Americans saying they want to ride in one (47%), suggesting time will solve that problem pretty handily.

Further still, the study finds that the more people have heard about them, the more likely they are to view them favorably. That's another example of how time will solve this: once they're available to more than a tiny, tiny, tiny number of people in a tiny, tiny, tiny number of places, folks will be hearing a lot more about them.


So no, I don't see anything there suggesting fear is any real obstacle to them at all, especially given how many years away a working, widespread, RT system appears to still be.
 
No one knows how long it will take for widespread adoption. How many AVs would be needed to displace the Uber drivers that operate in the San Francisco metro area? Even Waymo and Cruise have no idea how long it will take to offer service in every major metro area in every state. They have been testing in Miami for several years and still aren't transporting passengers there. They are testing in Dallas, Austin and Houston, but how long will it take them to get to Abilene, Amarillo or Lubbock? Five years from now, will someone be able to get a robotaxi in Omaha, Kansas City or Nashville?
 
So no, I don't see anything there suggesting fear is any real obstacle to them at all, especially given how many years away a working, widespread, RT system appears to still be.
Then there is nothing left to discuss on this. You have spoken.
 
George Hotz spoke a little about self-driving on Lex Fridman's podcast. They appear to be well funded (Chinese?), with good on-site resources, but comically poor on-road performance. He thinks Tesla is on the right track and has a 1-2 year lead.

 
you actually believe this video? Lmao
What's there to "believe"? The video showed clearly what happened. The reporter was clearly optimistic when the ride started, saying good things about how it drives, for example stopping fully at stop signs. At one green light the car sat with its hazards on, with no car in front, and they showed the car's screen saying "our team is working to get you going". In the end, the reporter called in to support to customize the location when it arrived at the wrong place, and it still chose to arrive at the bottom of the hill. She was going to the Randall Museum; she ended up at 155 States St (as verified by street view below), which is a 5-minute walk up a hill, so the statement checks out.

Google Maps

Are you suggesting the reporter faked it? I find that very unlikely, given that even the "shills," as you call them, were not able to get the car to halt in such a way. ABC7 is a local station here in the Bay Area with a good reputation, and the reporter in this story has been there since 1994. I don't see any reason to fake something like this. There are also tons of such reports of Waymo cars halting in traffic, many with video evidence, so it's not like this doesn't happen on a regular basis. That the reporter caught one instance doesn't seem out of the realm of possibility. Waymo's still a lot better than Cruise, but apparently it's not immune to random halting.
 
What's there to "believe"? The video showed clearly what happened.
Dude, if this were Tesla, the entire fan base would be giving 9,000 reasons why it's not Tesla's fault.
This has been the case with every single video and incident.
Heck, we just had the whole Ross debacle and how Tesla fans came out fiercely.
I don't think there will ever be an incident that Tesla fans will attribute to Tesla.
It's impossible for them. And look, here is Tesla FSD running two stop signs in one video.
Which people in this exact thread vowed doesn't happen, and that it was set up by Dan.
Every time there's a video of FSD doing badly, all the Tesla fans come to the rescue to say it's not actually doing badly, or that there's this other video that disproves it, or it's your fault, what did you expect, it's just a beta! Every time. Zero accountability.


The reporter was clearly optimistic when the ride started, saying good things about how it drives, for example stopping fully at stop signs. At one green light the car sat with its hazards on, with no car in front, and they showed the car's screen saying "our team is working to get you going".
The reporter was CLEARLY looking for a hit-piece.
How do you not see it? But when the CNN reporter tested FSD Beta,
the entire Tesla base claimed he was doing a hit-piece and made fun of him (which he wasn't)?
But this isn't one?



In the end, the reporter called in to support to customize the location when it arrived at the wrong place, and it still chose to arrive at the bottom of the hill. She was going to the Randall Museum; she ended up at 155 States St (as verified by street view below), which is a 5-minute walk up a hill, so the statement checks out.

Google Maps
Support doesn't customize the drop-off location. You as a user HAVE TO MOVE the pin to the drop-off location.
It was 1000% obvious she had no idea what she was doing. It was clearly user error.

Not only has Maya taken rides to that museum before and had it drop her off at the entrance, she also was able to drop a pin in front of the museum after watching the video.

This is 10000% User Error.

It's just like the Uber/Lyft app. Just because you enter an address doesn't mean the pin goes straight to where you actually want the driver to pick you up.

For Uber/Lyft, you only have to pin-point the pickup.
For a robotaxi, you have to pin-point the pickup AND the drop-off.

She didn't have the pin at the museum, so it didn't take her to the museum.
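To make the distinction concrete, here's a toy sketch (hypothetical structures and coordinates, not the actual Waymo or Uber apps): the car navigates to a pin, i.e. a lat/long, and typing an address only sets that pin's default position.

```python
# Toy sketch of address-vs-pin drop-off; not any real ride-hail app's API.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Pin:
    lat: float
    lng: float

def geocode(address: str) -> Pin:
    """Pretend geocoder: an address resolves to *some* nearby routable point."""
    # e.g. a museum's address might resolve to a point at the bottom of the hill
    return Pin(37.7645, -122.4389)

def dropoff_for(address: str, adjusted_pin: Optional[Pin] = None) -> Pin:
    """The car drives to the pin, not to the address string."""
    return adjusted_pin if adjusted_pin is not None else geocode(address)

# Without moving the pin, you get wherever the geocoder defaulted to:
print(dropoff_for("Randall Museum"))
# Dragging the pin to the entrance changes where the car actually stops:
print(dropoff_for("Randall Museum", adjusted_pin=Pin(37.7648, -122.4380)))
```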
I mean dude, watch it again. The support agent asks her to GO TO THE MAP AND FINE-TUNE the drop-off pin.
She puts on a bit of a dramatic act and then says, "I'm just gonna hit confirm and see what happens, idk what else to do".
Clearly she didn't do it. This is hit-piece 101.

You Tesla fans wouldn't even stand for objective videos/articles, let alone a drama-filled "MY LIFE IS OVER" hit-piece like this.
Come on. Be objective!



Also, when you enter an address, it auto-defaults to a drop-off location (which you can always move).
If you select a drop-off location that requires walking, it will warn you that a walk will be required, which you have to confirm
BEFORE YOUR RIDE EVEN STARTS!

Yet she acted SHOCKED and dismayed and went on a drama-filled rant when it dropped her off EXACTLY where it said it would.

This is 100000000000000000000% a HIT piece.



Are you suggesting the reporter faked it? I find that very unlikely, given that even the "shills," as you call them, were not able to get the car to halt in such a way. ABC7 is a local station here in the Bay Area with a good reputation, and the reporter in this story has been there since 1994. I don't see any reason to fake something like this. There are also tons of such reports of Waymo cars halting in traffic, many with video evidence, so it's not like this doesn't happen on a regular basis. That the reporter caught one instance doesn't seem out of the realm of possibility. Waymo's still a lot better than Cruise, but apparently it's not immune to random halting.

This is 1000000000000000000000000000% a hit-piece, based solely on the fact that she had no clue how to use the app. This is probably the most dramatic "news" video I have ever seen in my entire life.
 
Phil Duan gave a presentation at the 2023 Conference on Computer Vision and Pattern Recognition (CVPR 2023) and, during the question-and-answer section, he mentioned that Tesla is moving on from occupancy networks to something he called a foundation model.

The occupancy network itself was a big deal, and suggested to me at the time that they were going to have to rethink how to implement autonomy based on having that information. Now they're moving beyond that system to a newer, improved one, again requiring a rethinking of how to implement autonomy. Beyond that, they're messing with "end to end" work, which Phil declined to go into. I'm sure that will again require a rethinking of how to implement autonomy.
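For anyone unfamiliar with the term, the output of an occupancy network is conceptually just a 3D grid of "is this chunk of space occupied" probabilities around the car, which planning code can query without having to classify every object. A toy sketch of that representation (made-up numbers, nothing to do with Tesla's actual implementation):

```python
# Toy illustration of an occupancy-grid output; not Tesla's actual network or data.
import numpy as np

# Voxel grid around the ego vehicle: 100 x 100 x 10 cells, each holding P(occupied)
# as produced by some perception model upstream.
grid = np.zeros((100, 100, 10), dtype=np.float32)
grid[60:64, 48:52, 0:4] = 0.9   # pretend a vehicle-sized blob sits ahead of us

def region_is_free(occ: np.ndarray, x: slice, y: slice, z: slice,
                   threshold: float = 0.5) -> bool:
    """Planner-style query: is every voxel in this region below the occupancy threshold?"""
    return bool((occ[x, y, z] < threshold).all())

print(region_is_free(grid, slice(55, 70), slice(45, 55), slice(0, 4)))  # False: blob in the way
print(region_is_free(grid, slice(10, 25), slice(45, 55), slice(0, 4)))  # True: open space
```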

All this is confirming to me what may be obvious to others, but I think one big reason that Tesla hasn't been making a lot of visible progress with FSDb is because they spend the great majority of their time on infrastructure and basic methodologies that they need to implement the autonomy. They haven't found the secret sauce that gives them the confidence to believe they can achieve the higher levels of autonomy that they're after. On top of that, they're trying to solve a general problem of situational awareness not only for vehicle autonomy, but for robotics as well.

In terms that my fellow software engineers may identify with, this smacks of constant rewrites as the engineers discover better ways to implement their product's basic technology. They never do get to the end-user scenarios that the product is supposed to implement. Instead, they're making the database, networking, security, and custom data structures function better and better through rewrites as they learn more and more about them. Meanwhile, there are a couple of end-user scenarios that can do some basic stuff, but nobody spends much time on that.
 
they spend the great majority of their time on infrastructure and basic methodologies that they need to implement the autonomy. They haven't found the secret sauce that gives them the confidence to believe they can achieve the higher levels of autonomy that they're after.


If this is true in 2023, then one might argue that confidently stating back in late 2016, and continuously since, that all cars come with all the HW needed to do it was not the best idea.
 
Remember, this is something FSD Beta CANNOT do, yet this is 10 years behind.

What are you talking about? FSD Beta is more than capable of reading a closed road sign and rerouting. Here's a recent example at 9:15, and it does it way faster than Waymo:

If you're referencing reading hand-held stop signs, there's nothing inherently wrong with Tesla's stack that would prevent reading and obeying them, so it must be a choice Tesla has made. If they can identify roadside stop signs, they can also identify hand-held ones. Maybe they didn't want anyone with a printer and some red ink to be able to permanently stop any car with FSD Beta.
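The point can be made with a toy sketch (entirely hypothetical, not Tesla's code): whether a detected stop sign is obeyed is a separate, deliberate policy decision from whether it was detected at all.

```python
# Toy sketch of the detection-vs-policy split argued above; not Tesla's actual code.
from dataclasses import dataclass

@dataclass
class Detection:
    kind: str        # e.g. "stop_sign"
    handheld: bool   # roadside post vs. a sign held by a person

# Perception could plausibly emit both kinds of detections...
detections = [Detection("stop_sign", handheld=False),
              Detection("stop_sign", handheld=True)]

# ...while a separate policy flag decides which ones actually trigger a stop.
OBEY_HANDHELD_STOP_SIGNS = False  # the kind of deliberate choice speculated about above

def should_stop(dets: list[Detection]) -> bool:
    return any(d.kind == "stop_sign" and (not d.handheld or OBEY_HANDHELD_STOP_SIGNS)
               for d in dets)

print(should_stop(detections))  # True, because of the roadside sign
```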