
Tesla Model 3 Fully Autonomous ?

I think it will be an incremental process taking place over the next few decades. Push it too fast, and it doesn't take many serious accidents attributed to autonomous driving to unleash an army of lawyers that will force the regulations back to the stone age. It's easier to sue a person than a computer. Never mind the overall safety statistics.

Roads, especially lane markings, signs, and signals, will have to improve to support any sort of fully autonomous driving. We'll need exceptional computer vision with a high level of redundancy from sensors and systems all working perfectly in all weather conditions (3D vision coupled with radar, proximity sensors, precise GPS, etc.). I wouldn't be surprised if some roads end up certified or even designated for autonomous use, and others do not.
I would agree with this. I think you're probably spot on about certified roads. I'm envisioning a specific lane on highways with sensors that support autonomous driving. This would help with things like bad weather when not all of the road markings can be identified.
 
Also, I don't think anyone can yet predict what will happen when multiple autonomous cars share the same roadway in close proximity, all moving at high speed. When something unpredictable happens (such as an animal or a person running into the lane), how will the whole "train" of fast-moving autonomous cars react? We may see massive chain reactions where cars just continue to pile into one another, since each car's system cannot see far enough ahead to make the kind of proactive decision a human might, like pulling over after spotting brake smoke up the road. This might be improved by having some sort of communication between the cars, so that cars coming up behind get fair warning to start slowing down. It's one thing to slow a single car down by radar (like today's Autopilot), but entirely another for 20 fast-moving autonomous cars trying to swerve around an accident scene. That would be scary, like a train without the tracks.
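
To make the "fair warning" idea concrete, here's a rough Python sketch of how a broadcast hard-braking message might be used by the cars behind. Every field name, threshold, and number here is invented for illustration; it's not any real V2V standard:

# Purely illustrative sketch of a vehicle-to-vehicle "hard braking ahead" warning.
# All field names and thresholds are made-up assumptions, not a real V2V protocol.
from dataclasses import dataclass

@dataclass
class BrakeWarning:
    position_m: float   # sender's position along the road, in meters
    speed_mps: float    # sender's current speed, m/s
    decel_mps2: float   # how hard the sender is braking, m/s^2

def adjusted_set_speed(my_position_m, my_speed_mps, warning, comfort_decel_mps2=3.0):
    """Return a reduced target speed if a hard-braking car is somewhere ahead of us."""
    gap_m = warning.position_m - my_position_m
    if gap_m <= 0:          # warning came from behind us; ignore it
        return my_speed_mps
    # Highest speed from which we could still slow to the sender's speed within the gap
    # at a comfortable deceleration (v^2 = u^2 + 2as), rather than an emergency stop.
    safe_speed = (warning.speed_mps ** 2 + 2 * comfort_decel_mps2 * gap_m) ** 0.5
    return min(my_speed_mps, safe_speed)

# A car 100 m behind a vehicle that has slowed to 10 m/s starts easing off early:
print(adjusted_set_speed(0.0, 30.0, BrakeWarning(100.0, 10.0, 6.0)))  # ~26.5 m/s

The point is just that a warning relayed a few hundred meters back lets following cars shed speed gently instead of all discovering the problem at radar range.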

Also, any time many vehicles are running the same or similar algorithms, there is the potential for massive oscillations to propagate through the whole group.

Sorry, I'm just thinking out loud.
 
I would agree with this. I think you're probably spot on about certified roads. I'm envisioning a specific lane on highways with sensors that support autonomous driving. This would help with things like bad weather when not all of the road markings can be identified.

Booga,
I don't believe that the implementation of self-driving will require "any" additional elements added to the road (e.g., embedded sensors); the cost would be too high. Everything needs to be in the car, since there is no way to change any significant portion of the roads to comply with what "the car" needs.

I used to wonder how cars would handle things like wet pavement, where you literally cannot see the lane markings, or snow-covered roads where everything is under a couple of inches of snow. People even have trouble driving in those conditions. I read recently (forgot where) exactly how they are able to get cars to work in those conditions. It's not GPS, since that can be off by many meters. The way they do it is that cars driven around beforehand capture high-resolution LIDAR scans of the surroundings, along with high-definition camera images. A combination of the two is used. If there are no lane markings visible, the software compares the current camera video feed to the previously recorded imagery to compute distances to a set of known objects in both: things like street signs, benches, trees, even surrounding buildings. So even if no road markings are visible, this level of processing can pinpoint a vehicle's location to within something like 5 cm.
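
To make the geometry concrete, here's a toy Python sketch of the "known objects" idea. This is not anyone's actual pipeline, and the landmark coordinates and measured ranges are invented; it just shows that once the prior map stores exact positions for fixed landmarks and the car can measure its distance to a few of them, its own position falls out of simple trilateration, no lane paint required:

# Toy 2-D trilateration: solve for the car's position from known landmark positions
# and the ranges measured to them. Purely illustrative; all numbers are invented.
def locate(landmarks, ranges):
    (x1, y1), r1 = landmarks[0], ranges[0]
    # Subtracting the first circle equation from the others gives equations linear in (x, y).
    a, b = [], []
    for (xi, yi), ri in zip(landmarks[1:], ranges[1:]):
        a.append((2 * (xi - x1), 2 * (yi - y1)))
        b.append(r1**2 - ri**2 + xi**2 - x1**2 + yi**2 - y1**2)
    # Least-squares solve via the normal equations (2x2, so do it by hand).
    s11 = sum(ax * ax for ax, ay in a)
    s12 = sum(ax * ay for ax, ay in a)
    s22 = sum(ay * ay for ax, ay in a)
    t1 = sum(ax * bi for (ax, ay), bi in zip(a, b))
    t2 = sum(ay * bi for (ax, ay), bi in zip(a, b))
    det = s11 * s22 - s12 * s12
    return ((s22 * t1 - s12 * t2) / det, (s11 * t2 - s12 * t1) / det)

# Three mapped landmarks and the ranges the car measured to them
# (true position is roughly x=15 m, y=20 m):
print(locate([(0.0, 0.0), (50.0, 0.0), (0.0, 30.0)], [25.0, 40.311, 18.028]))

In the real systems the "ranges" come from matching camera/LIDAR data against the prior map, and there are thousands of landmarks, which is how the error gets squeezed down to centimeters rather than meters.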

So if you know where the car is within that error margin, and you know where the road and all road markings are within that error margin or better, then any road markings are nothing more than a "nice to have".

I'm not saying this solves all problems, just illustrating how road markings are not as important as they might first seem.

And having said all that, I do believe the rollout will be incremental: perhaps starting on selected freeway lanes (carpool lanes), then migrating over to the other freeway lanes, then making its way to non-freeway roads as the software improves. I could also envision limited geographic "autonomous regions" where cars are allowed to operate. Think "last mile to the train station" kinds of vehicles, where the roads they use to get to the destination are mapped out well in advance. I know for a fact that Uber is heavily involved in just this.

RT
 
I also wonder whether all 48 contiguous states would need to enact legislation. Consider a trip from New York to Florida where one of the states along the way has not. What happens when you have an accident in a state that does not allow autonomous driving and it's determined the car was operating autonomously?

Or the system is smart enough to tell you that you're in a state where you need to grab the steering wheel, yoke, joystick, or whatever it's called by then.
 
As Elon has mentioned more than once, Level 4 autonomy will be available in ~2 years (around the same time the Model 3 enters production), with another 1-2 years for legislation to pass.

Now, with that being said, and after doing some research on the software side, I think 2 years will be plenty of time to have hardware and software capable of Level 4 autonomy (not Level 5). Tesla has logged millions of miles so far with Autopilot on the S and X. I wouldn't be surprised if Tesla logs more data each month than Google has over 6 years. This data is KEY to getting legislation passed. 1-2 years seems reasonable considering the massive amount of data being logged, which can statistically demonstrate that autonomous driving is safer than human drivers by multiple factors.

Even if this time frame ends up twice as long due to EM's sometimes overly optimistic schedules, we are looking at 2022 for legislation-approved Level 4 autonomy. Has anyone seen Mobileye's presentations on its upcoming tech? Mind-blowing, to say the least.
 
I will be impressed when they can get lane following to work during a snowstorm.
I was impressed by how well Autopilot worked in a recent Houston-area rainstorm. Visibility was low, and AP kept working as traffic slowed from 65 to 20 mph.

I agree with those who predict that autonomous-driving hardware will soon (3-5 years) be in place and that limited areas will permit its full exploitation. Until it is legal everywhere, the safety aspects will be mandatory on all new vehicles. Current Tesla-level Autopilot features will be commonplace in 5 years or so.
 
Does no one here actually like to drive? I mean, I think it would be cool for the car to drive itself sometimes, but I still want to be able to drive when I want. I'm afraid of self-driving cars quickly becoming mandatory. Could we not have it where you have the choice to drive when you want, just with the computers making it impossible to crash? Could that not be a possible compromise?
 
Does no one here actually like to drive? I mean, I think it would be cool for the car to drive itself sometimes, but I still want to be able to drive when I want. I'm afraid of self-driving cars quickly becoming mandatory. Could we not have it where you have the choice to drive when you want, just with the computers making it impossible to crash? Could that not be a possible compromise?

You aren't the first one to state this, and it makes me curious: what sort of factors do you think would drive such technology to be mandatory? I don't see such a thing as at all likely in a free society, and I'm wondering how you feel that would come to pass.

Edit: It seems to me the likely outcome is that full automation would be an optional feature in most cars offering the capability.
 
The driver will always be responsible...period. At least in our lifetimes.

Well, that's strange, since the driver has not always been held responsible during my lifetime. I can point you to a ton of legal decisions where the driver was found not to be negligent because of a mechanical failure over which the driver had no control, resulting instead in a finding of product liability.

In my opinion, until the regulatory agencies address who is liable when these systems fail (and they will) and property damage or personal injury results (and it will), most manufacturers will actually be afraid to market and push these features forward in any but the simplest and most basic ways. And that, in and of itself, may also explain why no other manufacturers are fielding anything remotely like Autopilot; it's not because they can't build it, it's because they don't have the temerity to expose their organizations to the liability mess that is bubbling and broiling in that cauldron, waiting for the poor soul with the fortitude (or foolishness) to ladle out some of the brew.

I strongly disagree. Tesla, Mercedes and others are pushing it out as fast as they can build it. The race is on when it comes to self-driving cars and people are fooling themselves if they think companies are purposely sitting on the sidelines due to liability concerns. These companies have insurance policies that cover product liability. Tesla pays a fortune for insurance. If Tesla gets sued over an autopilot accident, their insurer steps in to defend and indemnify. Product liability insurance is big business. No car manufacturer is running scared of lawsuits, except perhaps VW since insurance covers unintentional acts or omissions, and not intentional acts, especially fraudulent ones.

Also, it's not the passing of legislation that will slow it down; that can be done relatively fast. What will take time is getting all the old non-autonomous vehicles off the road, since, like airbags, autonomous driving will eventually be mandated on even the most basic car. But, as they say, be very careful what you wish for... because if you love to drive like I do, driving yourself will soon (20 years?) go the way of riding a horse down the road: prohibited, except in designated, very limited recreational driving areas. And if you just turned 50 like me, you know it wasn't all that long ago that kids sat on phone books in cars instead of booster seats, no one wore seat belts, and "air bags" meant breathing into a paper bag the supermarket gave you (since there were no plastic bags) and then popping it to scare your family. 50 is not that old, and I just turned it; I laugh when people say how long things will take, since I consider myself young, and we're moving faster with technology with each passing year. Yes, it will probably be 20 years before all the cars are self-driving, but it will probably be less than 5 years before the first ones are.

I so look forward to getting in my car and driving to work, to my cabin, and just around town. I dread the day that I will have to get in my car, tell it where to go, and have it take me there. Sure, I'd welcome it after a few drinks, and once in a while at other times, of course, but it's not going that way, folks. The writing is on the wall, and it's the insurance industry that stands to benefit big time -- yet some here think insurers are concerned about liability... ha! The insurance industry is salivating over the money it will save during the transition, before legislation is (hopefully) imposed curbing rates in line with the drop in accidents. They want self-driving now! And they will be (and already are) the ones lobbying for the legislation. Tesla has it almost mastered already. Yes, drivers will be behind the wheel for the foreseeable future, but the insurance industry knows better than anyone else that humans make far more mistakes than machines, and the faster they can get machines involved and doing more than humans, the better for insurers.

Products Liability and Driverless Cars: Issues and Guiding Principles for Legislation
 
I think that once autonomous driving becomes more commonplace on freeways and limited-access highways, there will be a pivot point when manual driving will actually be prohibited on some sections of road. This will lead to increased speeds and closer distances between cars, and at some point the required reaction time becomes too short for normal human drivers to cope with.

People will of course be able to manually drive, just not on some roads.
 
I think that once autonomous driving becomes more commonplace on freeways and limited-access highways, there will be a pivot point when manual driving will actually be prohibited on some sections of road. This will lead to increased speeds and closer distances between cars, and at some point the required reaction time becomes too short for normal human drivers to cope with.

People will of course be able to manually drive, just not on some roads.

I just cannot understand the hysteria about mandated autonomous driving; I don't follow where that logic comes from. Is it likely that there will be vehicles that are capable of autonomous driving? Sure. Is it possible that insurance companies may come to provide discounts to those who use autonomous modes so as to incentivize it? Maybe. I don't see, however, how we get from that to some Minority Report "I'm a puppet not allowed to drive my own car" future.

Listen, the flu shot is a great idea, but you are not required to take one. You are not required as an ADULT to take any vaccine, in fact, and you can walk right into an Ebola hot zone if you want to, and that's just fine. Skiing the black diamond run when you are just a beginner is a bad idea, too, but there is no law banning you from doing it. It isn't a good idea to eat a Big Mac with fries every day, but if you want to, you can. And you can follow it with 12 Ding Dongs, a six-pack of Coke, and a pint of Jim Beam - all very bad ideas that can kill you... yet nobody is keeping you from doing it just because it's stupid.

My point is that things which are dangerous or bad for you don't necessarily imply that some nanny-state is going to deprive you of them just because there is the availability of something better or safer. People conflate the existence of safety mechanisms with a reduction in freedom, and they especially conflate mandates targeted at protecting the safety of minors and others with a restriction on their own personal freedoms. But these things are not one and the same. Yes, they do overlap: you cannot throw your kid in the bed of the truck with only a lawn chair to sit on and then do 70 on the interstate, and yeah, maybe that cramps your parental style... but saying it is not OK for you to endanger the life of your child is a long, long way from dictating to you that you aren't permitted to go drive an ATV at break-neck speed across the desert. You see the difference?

Listen, long before we have fully autonomous driving that anyone might think to mandate, we will have active safety systems that mitigate the risk of accidents even in cases of mind-numbing stupidity. And that will mean that truly autonomous driving will exist in the realm of convenience and economic value (e.g. a fleet of self-driving taxis vs. paying lots of drivers), not in some nebulous public safety realm.

Just my 2 cents worth.
 
I just cannot understand the hysteria about mandated autonomous driving; I don't follow where that logic comes from. Is it likely that there will be vehicles that are capable of autonomous driving? Sure. Is it possible that insurance companies may come to provide discounts to those who use autonomous modes so as to incentivize it? Maybe. I don't see, however, how we get from that to some Minority Report "I'm a puppet not allowed to drive my own car" future.
<snip>

Nice rant.

I wasn't talking about not being able to drive your own car. I was talking about some sections or types of roads being designated exclusively for autonomous driving. Very real possibility. You could go on them at your own risk, but you might not have the reflexes to keep up. Or you might get a fine, just as if you drove solo in a carpool lane today.
 
Does no one here actually like to drive?
Certainly not for my daily commute. That got very boring a long time ago.
And for a road trip, I'd rather be a passenger and be able to look around.
I'm afraid of self-driving cars quickly becoming mandatory.
I can't see fully autonomous mode ever being mandatory. Too many people (like you) are afraid of giving up control to allow that. But I do see two big changes in the near future:
  1. Autonomous safety features will become mandatory. All manufacturers have already committed to including automatic emergency braking on new cars (reference). I see crash avoidance (steering in addition to braking) being required soon after.
  2. Eventually, HOV lanes will be converted to "autonomous electric vehicle" lanes. This will be a much longer-term change, but I see this as much more likely (or at least a much earlier first-step) than having entire highways become autonomous-only.
 
To answer the original question, I expect the Model 3 to have Autopilot 2.0 hardware at the start of production. I think the software will be capable of fully autonomous driving, at least on divided, limited-access highways.

I can hardly wait!

GSP
 
all very bad ideas that can kill you... yet nobody is keeping you from doing it just because it's stupid.

What about bad ideas that can kill other people? You are prohibited from smoking indoors. You are required to get your car inspected. You have been and will soon again be required to vaccinate your kids. You are prohibited from riding your bike on the highway. Are those things wrong? Should you be allowed to do stupid things that endanger others?

***

Human braking reaction time is around half a second at best. Autonomous driving could drop that to roughly 1 millisecond. That half second represents 44 feet traveled at 60 mph.
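
Back-of-the-envelope check on that number, in case anyone wants to see the arithmetic (plain unit conversion, nothing vendor-specific):

# 60 mph converted to feet per second, then distance covered during each reaction time.
speed_fps = 60 * 5280 / 3600        # 88 ft/s
human_ft = speed_fps * 0.5          # ~0.5 s human reaction  -> 44 ft
machine_ft = speed_fps * 0.001      # ~1 ms machine reaction -> ~0.09 ft (about an inch)
print(speed_fps, human_ft, machine_ft)

So the half second of human reaction alone is roughly three car lengths of extra travel before the brakes even start to bite.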

***

In my small county, the most common arrest type is driving under suspension: people who have lost their licenses (i.e., dangerous drivers) and still drive, because they need to get to work and that is the only way. Being able to switch their cars to 'autopilot only' would be a huge win. Another problem we have is a large population of aging drivers. Autopilot on shared cars could be the thing that allows them to stay in their homes.


Thank you kindly.
 
Here is a small bit of a post I made on another thread:

-There are some people who think that self-driving cars will be mandatory within the next 30 years or so, but I'm hoping that at least on certain vehicles, like trucks or sports cars, we will still have the option to drive when we want to, just with the computer on in the background making sure you can't crash. I feel like this would satisfy the people who need the flexibility and utility that come with a truck, or the fun of driving a sports car, as well as the people who want the benefits that come with self-driving cars, like very few crashes. What are your thoughts on this idea?-

Now to add to that, what I'm basically saying is to have it so the driver can "play" driver. They can control the speed and where they actually go, but if they mess up or don't react quickly enough, the computer will fix it.

The obvious need for this is sports cars, but another very valid need is the practicality side of trucks. I don't know if you all have experience in this area, but I do: what about pulling a trailer? Are you going to teach a computer to pull a trailer? To launch and retrieve a boat? To park the trailer exactly where you want? It's a truck, so pull into your yard exactly where you want so you can unload your lumber? Pull around to my back porch so I can unload my new couch? What if you have a dirt driveway? This can also apply to any vehicle, just in different ways, not just trucks.

My point is that there are many things we use our vehicles for other than getting from A to B. So we have to let the driver at least play driver, with computers just keeping it safe. I can understand making self-driving mandatory on interstates to increase speeds and avoid traffic jams, but off the interstate, we have got to be able to control our vehicles.
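
For what it's worth, the "play driver" arrangement is really just an override layer that stays out of the way until a crash looks imminent. Here's a tiny hypothetical Python sketch of the idea; the names, the time-to-collision rule, and the 2-second threshold are all invented for illustration, not how any shipping system works:

# Made-up illustration of a driver-in-the-loop override: pass the driver's inputs
# through unless a simple time-to-collision estimate says a crash is imminent.
def arbitrate(driver_throttle, driver_brake, gap_m, closing_speed_mps, ttc_threshold_s=2.0):
    if closing_speed_mps > 0:
        time_to_collision_s = gap_m / closing_speed_mps
        if time_to_collision_s < ttc_threshold_s:
            return 0.0, 1.0                    # override: cut throttle, full brake
    return driver_throttle, driver_brake       # otherwise the human stays in charge

# Driver flooring it toward a car 20 m ahead while closing at 15 m/s (TTC ~1.3 s):
print(arbitrate(driver_throttle=1.0, driver_brake=0.0, gap_m=20.0, closing_speed_mps=15.0))

You still steer, you still pick the speed, and most of the time the computer does nothing at all; it only takes over for the fraction of a second where a human would be too slow.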
 
I think the issue of whether autonomous driving will be mandatory or not is a little up in the air. What is for sure is that it will take a long time.

Personally I believe this timetable (all dates from today):

Level 4 autonomous driving (still needs a driver in the car) hardware/software: 3-5 years
Congressional approval for hands-off driving: 6 years
Level 5 driverless cars: 10-15 years
Congressional approval: 10-15 years as well (basically in sync)
All new cars built MUST have autopilot as an option by law: 15 years

....After a certain amount of time in which driverless and autopiloted cars are on the road and show that human-driven cars are indeed more dangerous (not really a question), Congress may or may not REQUIRE cars to be driven autonomously if capable: 25 years

I agree with the others that the issue isn't freedom, but the safety of others. You can do a dangerous act relatively freely in America as long as it doesn't endanger others. Even to draw a parallel with cars today: you could say that tires with some tread are mandatory because, in the rain, a slick tire will hydroplane, and you know the rest. That is a risk to others on the road.

Like I said, it's up in the air on the mandatory question, but I do believe the Model III will AT LEAST have Level 4 autonomy hardware. It won't work in bad conditions or off main paths, and could even disable itself (pull over) if it doesn't understand something, but software can turn this into Level 5 over the years... hopefully :)
 
Nice rant.

I wasn't talking about not being able to drive your own car. I was talking about some sections or types of roads being designated exclusively for autonomous driving. Very real possibility. You could go on them at your own risk, but you might not have the reflexes to keep up. Or you might get a fine, just as if you drove solo in a carpool lane today.

Read the thread - I'm responding to the general tone as a whole.

I disagree that it is likely we would have any major roadways that are autonomous-only. There are multiple assumptions baked into that scenario.
What about bad ideas that can kill other people? You are prohibited from smoking indoors. You are required to get your car inspected. You have been and will soon again be required to vaccinate your kids. You are prohibited from riding your bike on the highway. Are those things wrong? Should you be allowed to do stupid things that endanger others?

***

Human braking reaction time is around half a second at best. Autonomous driving could drop that to roughly 1 millisecond. That half second represents 44 feet traveled at 60 mph.

***

In my small county, the most common arrest type is driving under suspension: people who have lost their licenses (i.e., dangerous drivers) and still drive, because they need to get to work and that is the only way. Being able to switch their cars to 'autopilot only' would be a huge win. Another problem we have is a large population of aging drivers. Autopilot on shared cars could be the thing that allows them to stay in their homes.


Thank you kindly.

I agree with pretty much every point you made. My argument wasn't with the value of autonomous modes; it was with the fear some people have that the existence of this technology will impair their ability to drive the car themselves if they wish to. I think that fear is largely unfounded; sure, there will be some fully autonomous cars. But I highly doubt we will see regulatory agencies attempt to mandate that you are not allowed to drive the vehicle. It is much more likely that the automated safety systems of the car will so reduce the likelihood of an accident that it will be difficult to crash one even if you try.
 
I agree with the others that the issue isn't freedom, but the safety of others. You can do a dangerous act relatively freely in America as long as it doesn't endanger others. Even to draw a parallel with cars today: you could say that tires with some tread are mandatory because, in the rain, a slick tire will hydroplane, and you know the rest. That is a risk to others on the road.

I question the basic assumption that having a human operator of a vehicle will, in 10 years, be any riskier than having a car driving fully autonomously, for the simple reason that active safety systems will be the first systems deployed. That is, long before we have cars that can reliably drive themselves, we will have cars that can reliably keep you from stupidly smashing your car into someone else's. Therefore, by the time we have fully automated cars (Level 5), the risk from human drivers and their errors will already have been almost entirely mitigated. One then has to question why, in such an environment, anyone would consider absolutely depriving people of the opportunity to pilot the vehicle to be necessary.

What I think is far more likely is that in 20 years or so, most cars will be capable of driving themselves, and the concept of operating a vehicle oneself will be seen more as a charming anachronism than as a risk to other people.
 