Welcome to Tesla Motors Club
Discuss Tesla's Model S, Model 3, Model X, Model Y, Cybertruck, Roadster and More.

Tesla Model 3 Fully Autonomous?

But to take a different point of view on customer acceptance: you talk about somebody who could drive their own car choosing a no-control driverless car, but what about the ability to not own a car and summon one at any time, the ability of an elderly person no longer able to drive safely, or a person coming home after a party with a couple too many drinks, to get where they need to go without the considerable expense that goes along with having human taxi drivers? Doesn't having those abilities compensate for some of the other tradeoffs? The question is at what stage a car is considered reliable enough to meet those needs, where having a human take over simply isn't possible.
And that's fine for the old people and the people who want to summon cars, but you should just have to accept that there are going to be some cases where the computer is not going to know what to do, and then you're going to be stuck. Those who want to drive should be able to. There will be computers to make it just as safe as a driverless car and to take over in crazy traffic to keep it flowing.
 
Or when a 4-way traffic light is out with heavy traffic, but drivers are not moving through in any order.

This is one of the cases I thought of where people do badly, because by law you are supposed to take turns. I find it very frustrating how badly people handle this situation. I think a computer car could probably do OK, though. It creeps forward when it thinks it should and observes how the other cars react.

Here are some completely predictable situations where a human would always succeed yet an autonomous car would fail:
After some heavy rain, a shallow stream runs through a crossable road / a not-so-shallow stream is running through the road

Given the number of idiots who drive their cars into streams crossing the road and end up needing to be rescued from their flooded cars every winter around here, I again dispute the claim that humans handle this well at all.

Road near construction site has a bunch of nails scattered
Cop simply says clear the area or go back the way you came
Narrow parking decks with various unmarked entrances and exits, where drivers must take turns moving
Access gates with remote attendants
Blind driveway
Sign or light at same angle as sun or reflecting sun
Something insubstantial, like a thin piece of electrified wire, is dangling ahead in the car's lane
A tornado is crossing the road just ahead

I think the toughest cases, where you have a mix of autonomous cars and humans, are the ones where people figure out what to do through direct communication. They look at each other or motion. Although there would be a learning curve, at least autonomous cars should behave more predictably than humans. A lot of the other ones are simply a matter of having good enough sensors. Theoretically, sensors could do a much better job than a human at sensing things like a thin piece of electrified wire, a tornado, or even something silhouetted against the sun.

I don't expect this to happen in 2 years or 5 years, but I don't see why all of the problems mentioned here couldn't be solved in 10 or 20 years. So I'm between the people who think that the Model 3 will be ready for fully autonomous driving and those who don't think it will happen in current people's lifetimes. While I imagine my 6-year-old son will learn to drive, I don't think it will be long afterward before you start getting large numbers of people who never do.
 
And that's fine for the old people and the people who want to summon cars, but you should just have to accept that there are going to be some cases where the computer is not going to know what to do, and then you're going to be stuck. Those who want to drive should be able to. There will be computers to make it just as safe as a driverless car and to take over in crazy traffic to keep it flowing.

Yep, sure, although that's spoken by people who have already taken the trouble to learn how to drive and want to exercise that skill. If autonomous cars become common, people will probably eventually stop bothering to learn (in fact, the number of young people learning to drive has already been declining steadily for years).
 
I think this thread has gotten out of hand with some absurd situations that humans fail miserably at, and sometimes die in. If the computer determines that something is unsafe, then there's a good chance that it is unsafe. Can a computer generate an alternate route when need be? Yes. Does an AI need to see ALL possible situations before being able to make correct decisions? No; that's not the point of AI. AI tries to figure out new inputs and situations.
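For the rerouting point above: a minimal sketch of how a car's navigation layer could route around a blocked road is just shortest-path search over a road graph with closed edges excluded. The graph, costs, and `reroute` helper here are all hypothetical toy examples, not anything from an actual autonomous-driving stack:

```python
import heapq

def reroute(graph, start, goal, blocked=frozenset()):
    """Dijkstra shortest path that skips edges marked as blocked.

    graph: {node: [(neighbor, cost), ...]}
    blocked: set of (node, neighbor) edges the car must avoid.
    Returns the node list of the cheapest open route, or None.
    """
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        for neighbor, edge_cost in graph.get(node, []):
            if (node, neighbor) in blocked:
                continue  # road closed: nails, flooding, police, ...
            heapq.heappush(queue, (cost + edge_cost, neighbor, path + [neighbor]))
    return None  # no open route at all: pull over and wait it out

# Toy road network: A -> B -> D is shortest; A -> C -> D is the detour.
roads = {
    "A": [("B", 1), ("C", 2)],
    "B": [("D", 1)],
    "C": [("D", 2)],
}
print(reroute(roads, "A", "D"))                        # ['A', 'B', 'D']
print(reroute(roads, "A", "D", blocked={("B", "D")}))  # ['A', 'C', 'D']
```

When every route is blocked the function returns None, which matches the "pull over, alert the owner, and wait it out" behavior discussed below.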

ttupper92618 has an example of helping someone dig their car out of the snow... obviously that person you helped acted exactly as the autonomous car would; I'm assuming that person was human, right? Would you extrapolate and say all humans shouldn't drive because that person wasn't able to get out? That doesn't make any sense. Can an autonomous car call for help like a human could? You bet it can.

Humans often make poor decisions. If an autonomous car pulls over to the side of the road because conditions become unsafe and a human is in the car, they can always override it; but if not, why would the car attempt to continue? Why not pull over, alert the owner to the location and situation, and wait it out?

I think there's some lack of knowledge here on the current state of the technology. Human-to-human communication, for example, is already being looked at in numerous contexts: Google's cars can interpret cyclists' hand gestures, the Mercedes concept car can alert pedestrians that they can pass, etc.

Autonomous driving is coming and it's coming fast. You won't have to wait 20 years. There will be millions on the road by 2020.
 
It could be, but not turned on. Only eight U.S. states have enacted legislation to progress toward autonomous vehicle operation, with half of those located in the Southwest. The statistical data that Tesla has gathered looks good so far, but approval and details are still way off in the future. That's the way I read it.

The data collected by Tesla today has little benefit in achieving full autonomy, because it does little to solve the many edge cases. The data contributes to good lane keeping, which is primarily Mobileye's invention.
 
On the contrary, the data collected by all Mobileye partners is aggregated. They've been collecting data for YEARS. They have tens of millions of miles full of situations and human responses.

That's definitely more driving than I've personally done in my lifetime.
 
Tesla has made it clear that they are keeping their own database. Mobileye's desire is to aggregate data from all users because it is in their interest to do so. It is not in Tesla's interest, as the technology deployment leader, to be "one of the gang".

Tesla's deal with Mobileye is likely:

1) Tesla does not share its user data with Mobileye, and is working to add other (non-Mobileye) data to the dataset.
2) Tesla pays Mobileye for its IP related to car location.
3) Mobileye stops talking about Tesla.

Tesla and Mobileye are natural frenemies: they have both shared and competitive interests.

But as I said, this data is clearly used primarily for lane keeping. Tesla can derive statistics such as autopilot miles driven per air bag deployment, but this does little to solve the many difficult problems encountered when attempting to implement a fully autonomous car.
 
I think there's some lack of knowledge here on the current state of the technology. Human-to-human communication, for example, is already being looked at in numerous contexts: Google's cars can interpret cyclists' hand gestures, the Mercedes concept car can alert pedestrians that they can pass, etc.

I didn't say it wasn't being worked on, obviously it is. I said it was a tricky problem. There are a wide variety of ways people communicate with each other, and especially getting people to understand what the car's intentions are is a difficult problem. That said, humans don't always communicate successfully with each other, and I'm sure it's a solvable problem.
 
@JeffK

Do you really have such a crappy car that you hate driving that much? Why do you want everyone to ride around in a mindless autonomous pod? Why can't you see that many people want and NEED to be able to control their vehicles? The computer will make it perfectly safe and regulate traffic, so what's the deal?
 
Humans often make poor decisions. If an autonomous car pulls over to the side of the road because conditions become unsafe and a human is in the car, they can always override it; but if not, why would the car attempt to continue? Why not pull over, alert the owner to the location and situation, and wait it out?

I thought I had made it pretty clear that I believe we will see autonomous vehicles soon. What I don't believe is that we will see widespread adoption of fully autonomous vehicles that do not allow human operator input. You can think of this as a contrast between Google's model and Tesla's (current) model. Google wants to entirely eliminate the human as operator from the equation; their next-generation vehicles don't even have a steering wheel, as an example.

The edge cases you seem intent on dismissing are actually the most important ones, because they frequently represent life or death situations. Like this one, as an example:


It was doubtless unsafe to drive in those conditions. However, NOT doing so would have been even more unsafe. And it is this aspect of human judgement which is the major issue with respect to fully autonomous vehicles.

It all boils down to the model we champion. I think Google's model - and the model of all of those who pontificate about how humans are dangerous and should not be allowed to operate a vehicle - is a fundamentally naive one that doesn't consider the real impacts and outcomes. The more likely outcome is "creeping automation" - automation that assumes more and more of the driving task, but which may not ever fully replace human drivers in all contexts.
 
@JeffK

Do you really have such a crappy car that you hate driving that much? Why do you want everyone to ride around in a mindless autonomous pod? Why can't you see that many people want and NEED to be able to control their vehicles? The computer will make it perfectly safe and regulate traffic, so what's the deal?

I'm confused. Since you didn't quote a post I went back over the whole thread and couldn't find a single post where he suggested that non-autonomous driving should be outlawed. This is the second time you've made a similar post, but you seem to just be threatened by the idea that there might be people who don't ever want to drive themselves.
 
As I said, my wish for the Model 3 is simply to cover not only highway driving, which the Model S does now, but also city driving and neighborhoods, to prevent accidents and take care of monotonous driving such as driving to work in the morning when I'm tired and probably don't have the best judgment and fastest reflexes.

As far as total 100% autonomy with no steering wheel, that's not for me. You'd have to pry the Model 3's spaceship-like steering controls from my cold dead hands and wipe the smile off my motionless face. I'm looking forward to Model 3 Ludicrous mode with so much anticipation, it's stressing me out.

In most futuristic sci-fi movies with autonomous driving the main character always has to engage manual controls before the real action ensues.
 
I thought I had made it pretty clear that I believe we will see autonomous vehicles soon. What I don't believe is that we will see widespread adoption of fully autonomous vehicles that do not allow human operator input. You can think of this as a contrast between Google's model and Tesla's (current) model. Google wants to entirely eliminate the human as operator from the equation; their next-generation vehicles don't even have a steering wheel, as an example.

The edge cases you seem intent on dismissing are actually the most important ones, because they frequently represent life or death situations. Like this one, as an example:


It was doubtless unsafe to drive in those conditions. However, NOT doing so would have been even more unsafe. And it is this aspect of human judgement which is the major issue with respect to fully autonomous vehicles.

It all boils down to the model we champion. I think Google's model - and the model of all of those who pontificate about how humans are dangerous and should not be allowed to operate a vehicle - is a fundamentally naive one that doesn't consider the real impacts and outcomes. The more likely outcome is "creeping automation" - automation that assumes more and more of the driving task, but which may not ever fully replace human drivers in all contexts.

I agree with the creeping automation. However, I think we look at things from opposite sides. You say that having a vehicle that doesn't allow a human to take over is undesirable. I don't really care whether a human can take over. The point is whether the car requires a licensed, capable, and alert driver ready to take over at any time, because having cars that don't require that would be a really huge advance and would allow all sorts of things that aren't possible today. The level of reliability and risk required of such a vehicle is something that people would need to decide. The point for me is not whether it is desirable, because it seems abundantly clear to me that it definitely is; the question is how long it will take before it is possible. The creeping-automation case there is that we will probably start with fully driverless cars in cities and slowly expand where they can travel.
 
I agree with the idea that autonomous cars will mostly be copilots: the system will alert the driver or completely take over if imminent danger is detected, but for the most part it will be there running in the background, making minor adjustments without the human driver noticing them. Sure, in 5 years it might be possible for a car to fully drive itself, yet most people will not activate fully autonomous driving because they will not trust it, even if under most circumstances it's the safest alternative.

Technology advances faster than humans are able to adapt to it. If fully autonomous driving becomes widely accepted and used, it won't be because it's safer or because legislation allows it, but because humans will trust it, which I think will come with the newest generations, those who grew up with fully autonomous cars.
 
I'm confused. Since you didn't quote a post I went back over the whole thread and couldn't find a single post where he suggested that non-autonomous driving should be outlawed. This is the second time you've made a similar post, but you seem to just be threatened by the idea that there might be people who don't ever want to drive themselves.
I have talked to some people who feel that there should be no human controls like a steering wheel and pedals installed, and who think that way is the best way and everyone should follow suit. That's the impression I got from him, due to the way he argued some of the points made. Some people feel that if you want to drive, then you're out of luck in the future, and yes, I do feel threatened by them, because driving is something I enjoy doing deeply. But if I assumed incorrectly about you, then I apologize.
 
I don't care how advanced the cars become, even if they get to the point of being able to drive without anyone in them. I just want to be able to control my vehicle, for the fun of my sports car, the utility of my truck, and the flexibility of knowing that there WILL be times when the computer is not going to know what to do. I still want to be able to put my car in autopilot and have it fully drive on its own, but then switch it off when I want to, with the computer keeping it safe and mitigating traffic in the background.