
Tesla Model 3 Fully Autonomous?

It doesn't matter when the technology itself is available; without regulatory approval in the various markets, that feature won't be offered. Same reason the rearview cameras (in lieu of rearview mirrors) on the Model X prototypes did not make it to the production vehicles.
 
So many people don't realize this or are choosing to ignore it.

We are a loooooooong way away from seeing any sort of commercially available "fully autonomous" vehicle. Google has been doing it for a decade+ and is nowhere near perfecting the technology. Not even close.

It is going to be so difficult to radically transition from something that has not changed since cars were invented. We can't even get rid of mirrors in favor of cameras... what makes you think we're going to get rid of human-controlled driving that not only requires a TON more cameras, but also an artificial intelligence that makes decisions based on what those cameras see?

Now, let's say you do create the perfect system. How do you convince the 50-60 year-old lawmakers that grew up before computers were even invented that it's safe and viable?

We may have a good system in a few years, but I doubt I'll ever live in a time where I can take a nap in the back seat while the car pilots itself.

Keep dreaming, people.
 
Elon Musk is pretty optimistic about the prospects, but even he says 5-6 years until it is viable, and 10 years before regulatory approval. I suspect he knows more about it than us.

I will be impressed when they can get lane following to work during a snow storm.

Thank you kindly.
 
Do you guys think the Model 3 will be fully autonomous straight from launch?

It could be, but not turned on. Only eight U.S. states have enacted legislation to progress toward autonomous vehicle operation, with half of those located in the southwest. The statistical data that Tesla has gathered look good so far, but approval and details are still way off in the future. That's the way I read it.
 
More importantly, most (all?) states have no laws against fully autonomous vehicles. The car will still need to be drivable by humans (so no removal of the steering wheel & pedals), but otherwise it's just a fancy cruise control. Law changes are really only required for a car that's going to forgo standard controls and safety equipment.

The main issue is liability when there's an accident. Car owners are going to be mad when their autonomous car gets in an accident and they find out that they're responsible for the damage instead of the car company. This is where laws need to be updated.
 
The driver will always be responsible...period. At least in our lifetimes. Look at the aviation industry. When I was flying regularly, once I was in the air the autopilot was able to fly to my destination including all course deviations, fly the approach, and I did not have to touch the controls until the flare for landing. However, if anything ever went wrong I was the one responsible...not the airplane. The technology for self-landing airplanes has been around for a long time, but the responsibility and liability are always on the pilot in command. I see autopilot autos in the same light.

Dan
 
We are a loooooooong way away from seeing any sort of commercially available "fully autonomous" vehicle. ... Keep dreaming, people.

I agree. The lawyers haven't even gotten started on this. And if you think they're going to let this by without taking a LARGE chunk of flesh, you're all dreaming. I can see a situation where the current autopilot gradually becomes more and more capable. Eventually it will drive entire routes, stop at traffic signals, avoid other traffic and navigate. But it will still be with the proviso that you're in command and should have your hand on the wheel at all times. And then, slowly legislation will permit, and perhaps even require, "hands-off" in gradually widening sets of circumstances.

I also foresee autodrive-only lanes cropping up on major highways. And eventually large freeways will become auto-drive mandatory. These lanes/roads would be faster, more closely spaced, and much higher capacity. You will program your exit prior to entering the lane/road and the system will take charge of entering & exiting for you. While on the auto-drive lanes/roads, manual control will be disabled.
 
Okay, before practical commonplace PCs were invented. ;) The kind of computers aspiring politicians would be using. They certainly weren't supercomputing the latest poll results!

You're also in the minority. People who understand and embrace new technology from that era typically didn't go into politics!
 
I think we'll see additional autonomous features before we see an announcement for the car being fully autonomous.

This might include automated valet parking, a mechanical charging connector (as they've already demonstrated) to enable connecting/disconnecting without the owner present, etc. The mechanical charging connector is needed because, if you summon your car from somewhere beyond its remaining range, it will have to charge along the way.

Because of this, my guess is that we'll see Supercharger 2.0 stations before we see fully autonomous driving. As you pull in, the charging cable will connect itself.

I think Elon has talked about a fully autonomous car in 2 years with a lot more built-in redundancy, but I don't think that's the Model 3 right away. It will likely come out on the Model S/X and then make its way down to the Model 3 another 3-5 years later.

I do also think they'll have to be careful with the wording of "fully autonomous," because it will still have weather related challenges, especially with snow and ice. In those conditions, I could envision some of these features being disabled.
 
I agree. The lawyers haven't even gotten started on this. And if you think they're going to let this by without taking a LARGE chunk of flesh, you're all dreaming.
No worries though.
  • People don't want to be responsible for their self-driving cars.
  • Google has been quoted as saying that the creator of the car should shoulder the insurance burden of autonomous cars (which is easy to do, when you don't yet sell a car, and you have billions in the bank).
  • A company is a much larger chunk of flesh for the lawyers to latch onto. I think they'll be fine with such a shift.
Currently, Tesla has made no similar statement. But Autopilot really is just a glorified cruise control, so it's not yet necessary for them to do so. We'll see which way they're leaning when we get to full autonomy. I'm guessing they will fight to keep liability on the owner, since they probably can't afford to self-insure like Google.
 
I personally don't agree.
The driver will always be responsible...period. At least in our lifetimes. ... I see autopilot autos in the same light.

Dan

Saying that means you do not think there will be fully autonomous cars, because otherwise who is responsible when nobody is in the car? While I think it's going to take a minimum of 10-15 years to get there, saying there won't be self-driving cars in the next 50 years is something I don't believe. You are talking about autopilot, which is where we are now. Full autonomy is something else entirely.
 
The technology isn't the issue, the regulatory environment is. And that environment is at least partially informed by liability issues, which is a giant hornet's nest just waiting to get poked. Tesla is already poking it with respect to autopilot, and the very name autopilot is in some sense an indication of where things stand: Tesla calls it autopilot specifically to underscore the fact that you are still the pilot. And lurking behind that is the ugly nebula of liability, wherein gigantic lawsuits and a whole new cottage industry for lawyers may potentially lurk.

In my opinion, until the regulatory agencies address who is liable when these systems fail (and they will) and property damage or personal injury results (and it will), most manufacturers will actually be afraid to market and push these features forward in any but the simplest and most basic ways. And that, in and of itself, may also explain why no other manufacturers are fielding anything remotely like autopilot; it's not because they can't build it, it's because they don't have the temerity to expose their organizations to the liability mess that is bubbling and broiling in that cauldron, waiting for the poor soul with the fortitude (or foolishness) to ladle out some of the brew.
 
While I think it's going to take a minimum of 10-15 years to get there, saying there won't be self-driving cars in the next 50 years is something I don't believe.

Thank you!! If I am still around in 50 years then I will gladly concede the point...seeing as I am over 50 now! Never said it wouldn't happen, just not in my lifetime.

...but thanks again for the compliment!

Dan
 
The technology isn't the issue, the regulatory environment is. And that environment is at least partially informed by liability issues, which is a giant hornet's nest just waiting to get poked. ...

In Tesla's case it's pretty clear that autopilot leaves responsibility with the actual driver. However, the NHTSA has placed the first stake for fully autonomous vehicles: on February 4, 2016, the U.S. National Highway Traffic Safety Administration advised Google by letter that the artificial intelligence system piloting the Google self-driving cars can be considered the driver under federal law.

For a real fully autonomous vehicle, any accident where the car's software is at fault is exactly the same as an accident caused by failed brakes. There is no human driver, so a driver can never be at fault. The owner of the car could potentially be at fault if the accident was due to negligent maintenance, and the manufacturer is at fault if it is due to a flaw in the software.
 
I think it will be an incremental process taking place over the next few decades. Push it too fast, and it doesn't take many serious accidents attributed to autonomous driving to unleash an army of lawyers that will force the regulations back to the stone age. It's easier to sue a person than a computer. Never mind the overall safety statistics.

Roads, especially line markings, signs, and signals, will have to improve to support any sort of fully autonomous driving. We'll need exceptional computer vision with a high level of redundancy from sensors and systems all working perfectly in all weather conditions (3D vision coupled with radar, proximity sensors, precise GPS, etc.). I wouldn't be surprised if some roads will be certified or even designated for autonomous use, and others will not.
 
We are a loooooooong way away from seeing any sort of commercially available "fully autonomous" vehicle. Google has been doing it for a decade+ and is nowhere near perfecting the technology. Not even close. ... Keep dreaming, people.

A little factual information and hard data goes a loooooooong way when it comes to figuring out how close Google is to perfecting self-driving technology.

Have a look at this annual report from Google required by the State of California:
[Google's autonomous vehicle disengagement report, filed with the California DMV]

In the 4th quarter of 2014, their self-driving cars required operator intervention due to a failure in the autonomous technology every 785 miles. By the 4th quarter a year later, in 2015, that number was up to 5,318 miles. See the following graph:

[Graph: average miles driven between autonomous-technology disengagements, by quarter]
So, in the quarter ended almost 4 months ago, Google's cars drove on average 5,318 miles between incidents where the autonomous system had to be overridden by the operator.

Please elaborate on how this is "not even close" to perfecting self-driving technology.
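
For a rough sense of that trend, here's a minimal back-of-the-envelope sketch in Python using just the two quarterly figures quoted above (the variable names are illustrative, not from the report):

```python
# Miles driven per autonomous-technology disengagement, as quoted above
# from Google's report to the State of California (illustrative sketch only).
miles_per_disengagement_q4_2014 = 785
miles_per_disengagement_q4_2015 = 5318

# Year-over-year improvement factor in miles between disengagements
improvement = miles_per_disengagement_q4_2015 / miles_per_disengagement_q4_2014
print(f"~{improvement:.1f}x more miles between disengagements in one year")
# Prints: ~6.8x more miles between disengagements in one year
```

Roughly a 6.8x improvement in a single year, whatever you make of the trajectory from here.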

Concerning "We are a loooooooong way away from seeing any sort of commercially available fully autonomous" vehicle... I guess it depends on your definition of loooooooong. Looks like Singapore will have commercially available autonomous vehicles later this year. Very loooooooong wait indeed ;)

nuTonomy to Test World's First Fully Autonomous Taxi Service in Singapore This Year - IEEE Spectrum

RT

P.S. Not very smart to bet against the software guys. They have all the money now...