Tesla Model 3 Fully Autonomous?

What I think is far more likely is that in 20 years or so, most cars will be capable of driving themselves, and the concept of operating a vehicle oneself will be seen more as a charming anachronism than as a risk to other people.

Fair enough. I do believe in such a case many people will not choose to learn how to drive, and eventually human-driven cars could die off naturally over a very long period of time. But yes, it makes sense that the only thing that will be mandated is that cars prevent accidents. However, I do think carpool lanes could potentially be replaced by autonomous-only lanes with higher speed limits (human drivers would just mess up the flow of traffic).
 
How does the AP decide course of action given a "no win" situation?

Something/someone is going to get hit. Take the hit head-on, since the car knows it can survive because the crash is determined to be within its safety parameters? Hard veer and potentially collide with a non-involved car? Hard brake and potentially be rear-ended (not because someone is following too close, but because the AP's reaction is so quick the human driver behind cannot follow suit)?

How will it make these decisions? Dog crosses the road...wreck the car or hit the dog?
 
How does the AP decide course of action given a "no win" situation?

Way better than a human. Generally, slamming on the brakes is the best course of action. Swerving to avoid collisions generally causes more and worse accidents (including turning a fender bender into a fatal one) than it prevents. AP will have tons of safety data to evaluate in determining the safest course of action, versus a human's generally flawed instinct.
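
Out of curiosity, here's roughly what that "brake first, swerve only into verified-empty space" policy could look like. To be clear, the numbers, names, and thresholds below are made up for illustration; nothing here is Tesla's actual logic:

```
# A minimal sketch of the "brake first" reasoning above. Thresholds and
# names are illustrative assumptions, not anything Tesla has published.

def choose_evasive_action(speed_mps, time_to_collision_s, adjacent_lane_clear):
    """Prefer straight-line braking; swerve only into verified-empty space."""
    # Distance needed for a full stop at ~0.9 g on dry pavement.
    stopping_distance_m = speed_mps ** 2 / (2 * 0.9 * 9.81)
    available_distance_m = speed_mps * time_to_collision_s

    if available_distance_m >= stopping_distance_m:
        return "brake"              # a full stop avoids the crash outright
    if adjacent_lane_clear:
        return "brake_and_steer"    # swerve only when the escape path is confirmed clear
    return "brake"                  # otherwise shed as much speed as possible

# 25 m/s (~56 mph) with 2 s to impact: stopping needs ~35 m, we have ~50 m.
print(choose_evasive_action(25.0, 2.0, adjacent_lane_clear=False))  # -> "brake"
```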
 
Read the thread - I'm responding to the general tone as a whole.

Check again. It sure looked like you responded directly to me, as you quoted me first.

I disagree that it is a likelihood we would have any major roadways that would be autonomous-only. There are multiple assumptions in that.

It's like someone saying 100+ years ago that it's unlikely that there will be motorways dedicated for cars only: "we would still need roads for horse carriages also!"

As I said earlier, it's likely to be a very gradual process. Technology is constantly being refined. It will happen in certain sections of the roads, certain lanes, little by little, until it's being expanded to more roads.

My great-grandchildren could be the last generation of drivers. To their children, seeing someone manually driving on a major road may look weird. That's not to say that they couldn't get a license if they wanted to, as a hobby or to go off-roading, just that they may not need to drive to live a productive life. Just like cars replaced horses a few generations before.
 
How does the AP decide course of action given a "no win" situation?
How will it make these decisions? Dog crosses the road...wreck the car or hit the dog?
Given time to think it over, how would you make that decision?

Personally, I wouldn't consider it "no win", I would consider it "minimize injury & damage as best you can". Hit the brakes to avoid the dog, and hope the car behind is paying attention and can slow down to avoid hitting you.

Now pretend you're a programmer who has thought about all of these scenarios. You program the car to follow the same decision making process.
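
To make that concrete: a programmer's "minimize injury & damage" process is basically a cost function over outcomes. The harm categories, weights, and probabilities below are invented just to show the shape of it; nobody outside Tesla knows the real ones:

```
# A toy "minimize injury & damage" ranking, as described above. The harm
# weights and outcome probabilities are invented for illustration only.

HARM_WEIGHT = {
    "person": 1_000_000,   # never trade a person for anything below
    "animal": 1_000,
    "vehicle_damage": 10,
}

def expected_harm(outcome):
    """Score an outcome as the probability-weighted sum of what gets hit."""
    return sum(HARM_WEIGHT[kind] * prob for kind, prob in outcome.items())

# Dog crosses the road: brake hard, or swerve across the centerline?
brake  = {"animal": 0.3, "vehicle_damage": 0.2}   # may still clip the dog, minor damage
swerve = {"person": 0.05, "vehicle_damage": 0.6}  # small chance of a far worse crash

options = {"brake": brake, "swerve": swerve}
best = min(options, key=lambda name: expected_harm(options[name]))
print(best)  # -> "brake": a 5% chance of hurting a person outweighs hitting the dog
```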
 
Now pretend you're a programmer who has thought about all of these scenarios. You program the car to follow the same decision making process.

The Tesla autopilot is being trained at the rate of a million miles a day (which I find dubious) to solve all driving problems. Even so, it is not as if we would have no input if it were all being done by programmers. AP will make the choice that we want it to, because we programmed it to.

Hard brake and potentially be rear-ended (not because someone is following too close, but because the AP's reaction is so quick the human driver behind cannot follow suit)?

Not an issue. Humans respond to brake lights the same whether the car in front is auto-piloted or human driven. If they rear-end the car, they were following too closely. No different than now.

Thank you kindly.
 
Listen, long before we have fully autonomous driving that anyone might think to mandate, we will have active safety systems that mitigate the risk of accidents even in cases of mind-numbing stupidity. And that will mean that truly autonomous driving will exist in the realm of convenience and economic value (e.g. a fleet of self-driving taxis vs. paying lots of drivers), not in some nebulous public safety realm.

This is great stuff here. I had previously supposed that once autonomous cars were proven much safer than normal cars, laws might mandate that humans no longer be allowed to drive. Kind of how automakers committed to NHTSA to make forward collision avoidance standard on new cars by 2022. For safety reasons.

What Tupper is essentially saying is that if you have a car that is proven to be 100x safer than a human driver when running autonomously, it is by default the ultimate "back seat driver". You no longer need a law that says "humans can't drive"; you just have a law that says humans can drive, but the autonomy system must be running, monitoring what is going on, and ready to take over to prevent anything really bad from happening. So conceptually it will let you drive 100 mph, but not let you intentionally or unintentionally rear-end someone at that speed.

To get back on topic, I don't believe that the Model 3 will have any features more advanced than the S and X. Those are more expensive cars, so anything new that goes on the 3 will show up on the S and X in a hardware refresh prior to the debut of the 3. So maybe in late 2017 we see hardware show up on the S and X that supports essentially "full autonomy", with the same hardware also on the 3, either on the base model or, more likely, as a pricey option. Then, with all the hardware present, Tesla "turns on" capabilities as both the underlying software supports it and the legal framework allows it.

This gets back to the quote from Elon (I believe) where he said they didn't need Lidar. That they could do it all with radar, cameras, and software. That would obviously be much cheaper, but you just need to be absolutely certain that you get the hardware platform correct from the start, so you don't need another hardware refresh to get to full autonomy.

RT
 
This gets back to the quote from Elon (I believe) where he said they didn't need Lidar. That they could do it all with radar, cameras, and software. That would obviously be much cheaper, but you just need to be absolutely certain that you get the hardware platform correct from the start, so you don't need another hardware refresh to get to full autonomy.

RT

My thoughts and hopes exactly. I believe I read somewhere they have 100 hardware engineers to the 50 software engineers for AP. Hardware comes first, so hopefully they'll have Level 4 hardware in the car and later we just need software updates. C'mon, Mobileye :)
 
So conceptually it will let you drive 100mph, but not let you intentionally or unintentionally rear end someone at that speed.

Tupper makes a good point. But first, I suspect that when regulators have control over the limits on how you can drive, not exceeding the speed limit will be near the top of the list. Second, even if not, 'real drivers' won't accept even Tupper's reasonable compromise. They (I think) want the risk. It depends on how much they believe in the system, I guess; they might be able to convince themselves that there is still a risk.

Thank you kindly.
 
My point is that something being dangerous or bad for you doesn't necessarily mean some nanny state is going to deprive you of it just because something better or safer is available. People conflate the existence of safety mechanisms with a reduction in freedom, and they especially conflate mandates targeted at protecting the safety of minors and others with a restriction on their own personal freedoms.
Sorry, but this has the feel of a strawman. Do you accept it is illegal to drive 100 mph through town? Do you find it acceptable that most governments require that your car have working brakes if you are going to drive it on public roads? Aren't they restrictions on your right to drive as dangerously as you wish?

Right now, even after the tremendous decrease in death by automobile over the past 30 years in the US, it is still one of the top ten causes of death for every age group except those under 1. Autonomous driving will cut that by 80 or 90 percent. It will be mandated not to protect you from yourself, but to protect everyone else on or near roadways from YOU.
 
Sorry, but this has the feel of a strawman. Do you accept it is illegal to drive 100 mph through town? Do you find it acceptable that most governments require that your car have working brakes if you are going to drive it on public roads? Aren't they restrictions on your right to drive as dangerously as you wish?

Right now, even after the tremendous decrease in death by automobile over the past 30 years in the US, it is still one of the top ten causes of death for every age group except those under 1. Autonomous driving will cut that by 80 or 90 percent. It will be mandated not to protect you from yourself, but to protect everyone else on or near roadways from YOU.

It's not a straw-man argument at all. Please note that I never stated anything about blatantly unsafe driving practices. I merely pointed out that unsafe human behaviors aren't necessarily abolished just because something better or safer exists. Regardless, this is irrelevant to the point.

My suggestion is that the "fully autonomous or nothing" argument is flawed - it represents a false dichotomy. Such an argument boils down to the premise that we cannot make transport safe without completely removing human operators from the equation, and I think that this premise is flawed and that as an idea it exists in tension with many complicating factors, one being that it fundamentally disregards the needs people have for a sense of personal agency. It is thus both irrational as an argument, being based on a false premise... and also in conflict with human nature.

In simple terms, any vehicle capable of autonomously navigating the roads safely while dealing with pedestrians, non-automated transport (bicycles, etc.), and all the other road hazards that can arise is a system equally capable of allowing a human operator a level of control while simultaneously preventing accidents. Indeed, those sorts of active accident-avoidance systems will be standard on most cars long before true autonomous behaviors are, and as a result the vast majority of safety gains will actually accrue long before the advent of L4 autonomy.

I don't think safety is somehow uniquely tied to full autonomy... or that it cannot be achieved without exclusively autonomous modes. IMO that's not a logical premise at all, and is more properly seen as a reflection of bias.
 
This is great stuff here. I had previously supposed that once autonomous cars were proven much safer than normal cars, laws might mandate that humans no longer be allowed to drive. Kind of how automakers committed to NHTSA to make forward collision avoidance standard on new cars by 2022. For safety reasons.

What Tupper is essentially saying is that if you have a car that is proven to be 100x safer than a human driver when running autonomously, it is by default the ultimate "back seat driver". You no longer need a law that says "humans can't drive"; you just have a law that says humans can drive, but the autonomy system must be running, monitoring what is going on, and ready to take over to prevent anything really bad from happening. So conceptually it will let you drive 100 mph, but not let you intentionally or unintentionally rear-end someone at that speed.

To get back on topic, I don't believe that the Model 3 will have any features more advanced than the S and X. Those are more expensive cars, so anything new that goes on the 3 will show up on the S and X in a hardware refresh prior to the debut of the 3. So maybe in late 2017 we see hardware show up on the S and X that supports essentially "full autonomy", with the same hardware also on the 3, either on the base model or, more likely, as a pricey option. Then, with all the hardware present, Tesla "turns on" capabilities as both the underlying software supports it and the legal framework allows it.

This gets back to the quote from Elon (I believe) where he said they didn't need Lidar. That they could do it all with radar, cameras, and software. That would obviously be much cheaper, but you just need to be absolutely certain that you get the hardware platform correct from the start, so you don't need another hardware refresh to get to full autonomy.

RT
See, this is exactly what I'm talking about! I completely agree 100% with this guy. You should be able to drive, but with the computer watching over you to protect you. If you think about it, it will be kinda cool. You can drive but don't have to worry about damaging your pride and joy of a car. It's possible we could even go much faster and enjoy high-speed driving, because the computer will make it completely safe.
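
Conceptually, that "computer watching over you" setup could be structured very simply. Everything in this sketch (names, thresholds) is hypothetical; it just shows the "pass the driver's inputs through unless a crash is predicted" idea:

```
# A toy sketch of the "computer watching over you" idea. All names and
# thresholds are hypothetical -- this is not Tesla's software.

from dataclasses import dataclass

@dataclass
class Prediction:
    time_to_collision_s: float   # seconds until the nearest predicted impact

def guardian_step(throttle, steering, prediction):
    """Return (throttle, steering, emergency_brake) after the safety check."""
    ttc = prediction.time_to_collision_s
    if ttc < 1.5:
        return 0.0, steering, True                    # imminent: cut power, brake hard
    if ttc < 3.0:
        return min(throttle, 0.2), steering, False    # getting risky: cap the throttle
    return throttle, steering, False                  # clear road: drive as you like

# Flooring it with nothing ahead passes straight through...
print(guardian_step(1.0, 0.0, Prediction(time_to_collision_s=30.0)))
# ...closing on someone's bumper at speed does not.
print(guardian_step(1.0, 0.0, Prediction(time_to_collision_s=1.0)))
```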
 
This is true if you're only looking at safety. However, autonomous driving allows other benefits, such as higher speeds at closer vehicle distances, etc. I believe there will be a point where human drivers (I'm talking about average folks, not experienced racecar drivers), due to slower reflexes, will be incapable of being in the same traffic flow as autonomous cars, and if they are, they will seriously impede it. For this reason I suggested earlier that there will be sections, or lanes, of the roadway designated for the exclusive use of autonomous driving.
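
Some back-of-envelope numbers on why reaction time is the bottleneck (the reaction times and car length below are rough assumptions, not measurements):

```
# Minimum safe gap is roughly speed x reaction time when braking ability is
# comparable, so reaction time sets how tightly a lane can pack.

def vehicles_per_hour(speed_mps, reaction_time_s, car_length_m=4.5):
    headway_m = speed_mps * reaction_time_s + car_length_m
    return 3600 * speed_mps / headway_m

speed = 33.0  # ~120 km/h
print(round(vehicles_per_hour(speed, 1.5)))   # human, ~1.5 s reaction: ~2200 cars/lane/hour
print(round(vehicles_per_hour(speed, 0.2)))   # autonomous, ~0.2 s:    ~10700 cars/lane/hour
```

One slow-reacting human in that chain forces everyone behind to re-open the human-sized gap, which is exactly why dedicated lanes make sense.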
 
I see so many people saying this will never happen due to regulations... so many people are also convinced it won't happen anytime soon. Do you even read studies on the subject? Do you know what the law of accelerating returns is? These sorts of technologies don't take decades to mature anymore, just years. We're not in the 1900s anymore.

Do some research on just Google's progress alone. It's incredible what the software and hardware is already doing, TODAY. These cars are driving thousands of miles without needing intervention, up from hundreds of miles just a few months ago, not even years. This is on regular roads, even dirt roads! Within a few years that number will be past hundreds of thousands of miles without needing intervention. Whether or not you choose to accept it, the reality is we are VERY close to these systems being able to drive fully autonomously "generally" full-time.

Have all states passed legislation that allows cars to drive themselves fully autonomously on highways? Are Tesla cars doing this right now as we speak? They can simply integrate the full autopilot tech into the Model 3 with a stipulation that you must be alert at all times and ready to intervene. They made the exact same warning for Autopilot, yet you see how well it works already.

Don't fight the future... embrace it!
 
I fully support the implementation of autonomous driving and new technology. I do hope though that as this technology is implemented there is always room for human operators to drive vehicles of their choice, with or without a computer's monitoring. You can see my avatar. I would hate to be told I could no longer enjoy driving her from time to time solely based on the fact that the technology required to operate on these modern roads didn't exist in 1969 when she was built. There has to be middle ground somewhere.

Dan
 
How does the AP decide course of action given a "no win" situation?

Something/someone is going to get hit. Take the hit head-on, since the car knows it can survive because the crash is determined to be within its safety parameters? Hard veer and potentially collide with a non-involved car? Hard brake and potentially be rear-ended (not because someone is following too close, but because the AP's reaction is so quick the human driver behind cannot follow suit)?

How will it make these decisions? Dog crosses the road...wreck the car or hit the dog?


Ummmm, if you slam into a car in front of you, you were following too close. It doesn't matter how quickly the autopilot can hit the brakes; if you give yourself sufficient time to stop by spacing properly for the current driving conditions, then it isn't an issue. Chain car wrecks happen today because of multiple drivers tailgating or driving too closely, no autopilot required.
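
For anyone who wants the numbers behind "following too close": total stopping distance is reaction distance plus braking distance. The 1.5 s reaction and 0.8 g deceleration below are assumed typical values, not measurements:

```
# Stopping distance = reaction distance + braking distance.
# Assumed: 1.5 s human reaction, 0.8 g braking on dry pavement.

def stopping_distance_m(speed_mps, reaction_s=1.5, decel_g=0.8):
    reaction_d = speed_mps * reaction_s
    braking_d = speed_mps ** 2 / (2 * decel_g * 9.81)
    return reaction_d + braking_d

for kmh in (50, 100):
    v = kmh / 3.6
    print(f"{kmh} km/h: ~{stopping_distance_m(v):.0f} m to stop")
# 50 km/h -> ~33 m; 100 km/h -> ~91 m. If your gap is shorter than the
# reaction-distance portion alone, nothing the car ahead does can save you.
```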